Author Archives

Hatim Rahman

Commentary, New book

Author Meets Critics: Engaging Hatim Rahman’s Inside the Invisible Cage

May 20, 2025

The following is a loosely edited transcript of the Author Meets Critics event devoted to Hatim Rahman’s Inside the Invisible Cage (University of California Press, 2024). The event was held on April 8, 2025, and was sponsored by Work In Progress. The full hour-long video will be posted soon.

Inside the Invisible Cage provides an in-depth account of “TalentFinder,” the pseudonymous platform that has become the world’s dominant provider of online freelance services. The book stands as the most important analysis of the mechanisms that crowdworking platforms use to control the behavior of the highly skilled contractors and consultants they attract.

Continue Reading…
Research Findings

For Gig Workers, Resistance Against Digital Bosses Is Not Futile

August 4, 2024

In the gig economy, the customer isn’t king. They’re the emperor.

Workers on Uber, Lyft, TaskRabbit, Instacart, Fiverr, Upwork, and other labor platforms are at the mercy of algorithms that funnel jobs their way based on customer ratings. The better the ratings, the better the assignments. By using customers as the means of control, the platforms have ditched traditional management and gained tremendous efficiencies for themselves and for their customers. Who doesn’t love the ease of calling a rideshare to their exact location with a few taps on a cellphone, compared with standing on a street corner to hail a cab?

Continue Reading…
Research Findings

Opaque algorithms are creating an invisible cage for platform workers


December 9, 2021

We live in a world run by algorithms. Nowhere is this more apparent than with platform companies, such as Facebook, Uber, Google, Amazon, and Twitter. Platforms claim that their algorithms collect and use our data to optimize our experience with breathtaking speed and efficiency. 

Recent reports from scholars, journalists, and policy makers, however, have revealed that platforms’ algorithms exacerbate bias and discrimination in ways that are difficult to audit. 

In my recent study of workers on a labor platform, I found a broader concern about the way platforms use algorithms to control participants. Platforms’ algorithms create an invisible cage for platform users, because workers have no reliable way of knowing how their data is being processed or used to control their success on the platform.

Continue Reading…