Ever since the CBS 60 Minutes documentary ‘Training AI takes a heavy toll on Kenyans working for $2 an hour’, which I was featured in, aired four months ago, I’ve received many calls and texts. Some were from family and friends offering support and solidarity, but surprisingly, most were questions like ‘Where can I apply for this kind of job? Can you refer me?’ I was stunned. Did they not hear what we said? Each question felt like a betrayal. Had we not just explained how this job left us with night terrors, how our team was diagnosed with PTSD from reviewing child abuse, suicide, and self-harm footage before breakfast?

We spoke about exploitation, emotional exhaustion, post-traumatic stress disorder (PTSD), generalized anxiety disorder (GAD), and the many other mental health disorders this work had left us with. Yet after hearing all this, people still wanted in. That’s how desperate joblessness in Africa has become: so dire that people are willing to risk their well-being for a paycheck.

When unemployment hits 40% in Africa, people don’t hear warnings; they hear salaries in USD. That’s the calculus of late-stage capitalism: exploitation masquerades as opportunity.

In this context, tech giants continue to exploit vulnerable youth in Africa through subcontractors, profiting from their labor and the data they generate, only to discard them once they speak up about how mentally taxing the job is and ask for better working conditions. These companies benefit from our work but offer nothing in return: no fair pay, no support, no concern for our mental health. And when we speak out, we are silenced, ignored, and laid off.

For those of us who were in the field, and surely for those still doing this job, the mental and emotional toll of content moderation is not just a side effect; it is the reality of the job. PTSD, anxiety, depression: the conditions many of us now live with are direct consequences of the traumatic content we engage with day in and day out. It becomes impossible to ‘turn off’ the horrors we encounter. We carry them with us long after the workday ends, even years after leaving the job.

The mental and emotional toll of data work

PTSD, like any traumatic experience, has a way of making you feel that everyone and everything is against you. It makes the world feel like a very gloomy place and your mind a place of dark thoughts. Personally, after I was diagnosed with the many disorders this job has left me with, I found it hard to be around people. I became uneasy, always on edge, emotional, and sensitive. The feeling of victimhood and some of the silencing tactics I faced from my former employer were intimidating, and untreated PTSD is overwhelming. In many African countries, where talking openly about mental health is taboo, admitting that you’re dealing with psychological trauma, on top of your childhood trauma, feels like pinning a ‘kick me’ sign on your back in a society that already uses women as punching bags in any situation or circumstance. The feeling is overwhelming, especially for an immigrant with no support system.

Like many of my colleagues, when I joined Sama, I thought my life would change: better pay, stability, and a future where I could grow both professionally and personally. For a while, it did seem so. But no one warned me that my daily tasks would include watching suicide videos and cataloging dead bodies.

For Ethiopian content moderators, particularly those of us from Tigray, where a genocide was being committed against our own people even as we moderated the war content, the job was a nightmare we couldn’t wake up from.

For two years, I believed scrubbing blood stains from Facebook Live feeds was ‘tech-adjacent’ work, and I felt like I was in control, saving my community from harmful content so they could scroll in peace. Then came the migraines, the nightmares, the deep sadness and anxiety, and then the day I realized I’d memorized the sound people make when they’re set on fire.

Every suicide video we tagged, every beheading we categorized, every racist slur we annotated: that’s what taught Zuckerberg’s algorithms to ‘moderate’ themselves. Our trauma and psychological scars became their machines’ intelligence.

The cruel irony is that the very AI we helped train is the same force now stealing jobs from the people. This isn’t just about Facebook. Every major tech company operates the same way: obsessed with domination, control, and profit. They exploit the vulnerable, use human intelligence to build their artificial intelligence, extract people’s personal data, and get away with it.

Every content moderation center in Nairobi, every data labeling firm in Africa, every ‘microtask’ platform preying on Venezuelan migrants: we are the ghost labor haunting their shiny algorithms. They need our pain to sanitize their platforms, yet they delete or deny our existence in their annual reports.

Safe data work, safe workers

This strenuous and traumatic work will not cease until AI is fully advanced, and even then, its impact on humans and workers must be taken into account. If AI’s advancement is inevitable, why do tech giants neglect safe and responsible AI practices? Is it truly impossible, or is it a matter of corporate greed and neglect? The truth is, it’s not impossible; it’s a choice.

Tech giants and their subcontractors don’t seem to care about humanity’s safety. Responsible AI should include measures to protect the workers who help train these systems. Safe data work demands better pay, better working conditions, and a safe, supportive environment that helps workers deal with the emotional toll of the job. Companies like Facebook, Teleperformance, TikTok, and others must prioritize creating safer workspaces for their employees.

The importance of digital labor policy regulation in Africa

The conditions I’ve described are not unique to me; they are the reality for thousands of data workers across the Global South, particularly in Africa. These exploitative practices are intentionally hidden, designed to make the labor invisible and allow companies to escape responsibility.

So, when people ask me how to get into this work, I want to scream, ‘Open your eyes!’

But I know why they’re asking. When survival is at stake, even poison starts to look like water.

For years, content moderators have been demanding better working conditions and a safer, more responsible approach to AI. On this year’s May Day, we call for solidarity, understanding, and compassion for data workers everywhere.

We need better working conditions, proper mental health care, fair compensation, and policies that protect workers, not just tech giants. The hidden workforce that powers AI should be recognized and celebrated, not erased.

These past few months, I was able to attend workshops discussing the efforts of civil society organizations in Africa striving for fairness and safety in the tech industry. That work is commendable, but it’s time for African governments to pay attention to these realities and push for policies that protect workers from exploitative labor practices.

Moreover, on this May Day, I urge everyone to stand in solidarity with data workers everywhere. Our labor is the backbone of the digital economy, and it’s time we were recognized for the work we do and the sacrifices we make. It’s time for tech companies and governments to step up, ensure responsible AI development, and create safer working conditions for the people who make it all possible. We deserve better. We deserve respect. And we deserve the chance to heal from the scars this work has left on us.

Fasica Berhane’s report, published as part of the Data Workers’ Inquiry, can be found here.