Everyone is beginning to grasp the reality of working conditions in the platform economy. We know that riders travel in scorching heat to deliver food and groceries, drivers work over 12 hours in gruelling city traffic, and help around the house is just a click away, thanks to the scores of daily wage workers available at a moment’s notice. Governments are taking cognizance, and regulation is slowly on the way, despite misgivings about its ability to truly hold digital labor platforms accountable. But when we talk of platform work, we rarely think of the labor being performed on social media sites. Of course, we are increasingly aware of data workers and content moderators, and their role in keeping social media platforms afloat and viable. But what of those people who produce content on social media?

The rise of the content creator economy has been astronomical over the last decade and a half. It is currently valued at around $250 billion and is projected to grow to nearly $500 billion by 2027. It can be argued that this ascent has been instrumental in the transformation of social media from a place of peer exchange to something resembling traditional media, with advertisements, partnerships, and promotions fronted by those who do the work of populating our social media feeds to keep us ‘engaged’. Social media platforms have been designed to hold our attention for as long as possible, because attention is their most valuable asset. Every mode of engagement (liking a post, watching a video, or pausing on an image) generates data that reveals our preferences and behaviors, all the better to bombard us with targeted advertisements. If we understand that social media is primarily an attention economy, then those who keep us attentive are doing the labor underpinning it.

Work on social media is less systematic than data or crowd work, and less visible materially than on-demand platform work, occupying a liminal space in our popular imagination of digital work. This article will not discuss how social media content has distorted our culture, politics, and democracy. Rather, it switches up the lens to see what happens when we view social media as a workplace—how do content creators experience platformized working conditions? Who really profits from content creation on social media?

To start with, social media platforms perfectly encapsulate the problem of rentier capitalism in the digital age. In short, rentier capitalism has been described as “having, rather than doing”, meaning value is extracted not through the production of goods and services that contribute positively to economic development, but through the ownership and control of critical infrastructures (in this case, platforms for the creation and exchange of ‘content’, owned by a handful of Big Tech firms). Platforms like YouTube, TikTok, and Instagram appropriate and monetize user-generated content by functioning as intermediaries between creators and audiences, profiting from advertising, data analytics, and visibility algorithms. These platforms operate as ‘walled gardens’ where firms capture value by enclosing and controlling key nodes of digital interaction and innovation. This economic model effectively makes social media behemoths landlords of virtual real estate, who charge rent in the form of algorithmic visibility and access to monetization opportunities. The rentier capitalist structure of social media platforms is thus characterized by asymmetrical power, which gives platforms absolute ownership and control over a crucial resource and allows them to extract value from the interactions between content creators and users. By doing so, platforms externalize risk and precarity onto creators while consolidating control and profit.

This rentier logic produces new modalities of labor control, and the main tool of control, much as on most platforms, is the algorithm. The bread and butter of content creators is visibility, followers, and engagement: the more they drive these metrics up, the more opportunities open up for brand partnerships, sponsorships, paid content, and so on. For content creators, therefore, algorithms are not merely a technical feature of social media platforms; they are a defining condition of their labor, since they set the terms of visibility on the platform. Social media algorithms are extraordinarily sensitive and responsive: every click, like, and unlike, every second spent looking at content, and every skip is methodically and ruthlessly tracked, creating an instantly responsive feedback loop. These proprietary algorithms determine what type of content is made visible, promoted, or capable of monetization, and also what type of content is suppressed, rejected, or rendered invisible, without offering creators meaningful insight into how these determinations are made. The algorithm becomes both boss and manager, but creators have no way of engaging with this mechanism. Their only option is compliance and obedience to something they view as an ‘agentic’ creature (notwithstanding that algorithms are better understood as a sociotechnical process of organizing cultural work and consumption). This opacity and unknowability forces creators into a state of continuous strategic improvisation, in which they must experiment, self-monitor, and revise their output in response to subtle, often inexplicable shifts in reach or engagement. For instance, when content creators realized that Instagram was privileging reels and videos over static posts, they quickly began to layer their text posts over random video backgrounds to ensure visibility in the new regime.
Creators have been described as “always algorithmically aware”, calibrating their content on a day-to-day basis to satisfy the volatility and fleetingness of algorithmic visibility. When parsed through the lens of work, the algorithm’s mode of operation in effect amounts to arbitrary changes in working conditions, with no notice or transparency for workers. Furthermore, what algorithms suppress (so-called shadowbanning) is just as important as what they make visible. Social media algorithms constantly demonetize, deprioritize, and remove content, sometimes without explanation. Content that goes against arbitrarily enforced community guidelines can even result in the suspension of creators’ accounts, abruptly cutting them off from their livelihoods. For many creators, particularly those producing content on sensitive issues (such as sexuality, race, politics, or social justice), algorithmic moderation functions as a form of censorship, silencing voices on topics deemed unprofitable or controversial. This is a constraint not only on visibility and revenue, but also on the kinds of selves and subjects that can be made visible in social media feeds. In this way, algorithmic governance enacts a mode of labor discipline that is subtle, invisible, and deeply tyrannical, demanding constant adaptation while withholding any insight into its modes of operation.

One of the main features of social media is that there is something for everyone out there. The highly sensitive algorithm means that content has to be highly personalized. As previously stated, creators are the workers who ensure that what we see is relatable and that we keep coming back for more (that is, that engagement stays consistent and high). What remains hidden, however, is the labor that content creators invest in performing a certain way of life that is ripe for consumption by users. This can be described as affective labor: the work of creating, producing, or manipulating emotional experiences in others. For instance, it is not sufficient for a fashion blogger to simply make a video endorsing a pair of jeans. To sustain reach and engagement, and to make themselves relatable to their followers, they have to show how the pair of jeans fits into their daily life, wearing it to a birthday party or out on a date. Mommy bloggers share the ups and downs of parenting and record their days in and out (e.g., “a day in the life” videos) to produce content that is evocative of their audience’s lived experiences. Wellness influencers have to embody a certain ‘lifestyle’ contingent on elaborate skincare routines, supplements, gruelling fitness regimes, and more. They are not only selling the products they use, but selling the idea that everyone can have this kind of ‘aestheticized’ life, if only we use what they use and do what they do. This type of content, therefore, is performance disguised as authenticity, even as it is vilified as ‘frivolous’. In fact, producing this type of cyclical content consistently takes an enormous toll on the well-being of creators. They have to offer up the minutiae of their daily lives for public consumption, sacrificing their privacy (and often their families’ privacy too) in the process.
This type of visibility also renders them prime targets of abuse and hate online, with influencers reporting relentless harassment from online trolls. At the other extreme, the deeply personable and relatable nature of their content also cultivates parasocial relationships with their audiences, to the extent that followers demand to know every detail of their personal lives and have even turned up outside their homes, putting their physical safety at serious risk. These incidents underscore the vulnerabilities influencers face due to intense public scrutiny, and the blurred lines between their online personas and private lives. Such risks are often written off as occupational hazards of being a public figure, but content creators are workers, and their occupational safety is now a matter of public interest.

The value of the cultural work being done by content creators is undeniable. This article has only provided a flavour of the themes that emerge when we begin to see social media sites as workplaces and creators as workers, rather than as mere sites for peer exchange—a relic of the bygone era of the early internet. Seen in this way, it becomes clear that content creators have not been spared the platformized precarity that affects other digital workers.

Improving the conditions under which content creators labor requires structural changes that go beyond individual resilience or platform-specific hacks. A few steps have been suggested towards improving these conditions. First, platforms must be held accountable through greater socio-technical transparency and regulatory oversight, requiring them to disclose their algorithmic logic and content moderation criteria. Without this, creators are trapped in cycles of uncertainty, suppression, and censorship. Second, fairer economic models are necessary to challenge the asymmetry of rentier capitalism. These could include more equitable revenue-sharing schemes for advertising (currently, social media platforms capture the lion’s share of this revenue), the establishment of creator unions or cooperatives, and stronger protections against arbitrary demonetization or deplatforming. Crucially, the affective and relational labor that creators perform must be recognized as real work, deserving of rights, support, and sustainable compensation. As content creation on social media becomes a dominant mode of cultural production and economic activity, our regulatory and policy imagination must evolve to support the well-being and autonomy of those whose labor makes this model viable.