Meta, the parent company of Facebook, has stated that its contractors are obligated to pay their employees above the industry standard in their respective markets and provide on-site support from trained practitioners | Pic: Wikimedia Commons

Facebook moderators in Africa battle horror at work, fight back

On the verge of tears, Nathan Nkunzimana in Nairobi recalled watching a video of a child being molested and another of a woman being killed.

Eight hours a day, his job as a content moderator for a Facebook contractor required him to look at horrors so the world wouldn’t have to. Some overwhelmed colleagues would scream or cry, he said.

Now, Nkunzimana is among nearly 200 former employees in Kenya who are suing Facebook and local contractor Sama over working conditions that could have implications for social media moderators around the world.


It is the first known court challenge outside the United States, where Facebook settled with moderators in 2020.

The group was employed at the social media giant’s outsourced hub for content moderation in Kenya’s capital of Nairobi, where workers screen posts, videos, messages and other content from users across Africa.

Their job was to remove any illegal or harmful material that breached Facebook’s community standards and terms of service.

The battle

The moderators from several African countries are seeking a USD 1.6 billion compensation fund after alleging poor working conditions, including insufficient mental health support and low pay.

Earlier this year, they were laid off by Sama as it left the business of content moderation. They assert that the companies are ignoring a court order for their contracts to be extended until the case is resolved.

Facebook and Sama have defended their employment practices.

“If you feel comfortable browsing and going through the Facebook page, it is because there’s someone like me who has been there on that screen, checking, ‘Is this okay to be here?’” Nkunzimana, a father of three from Burundi, told the Associated Press.


The 33-year-old said content moderation was like soldiers taking a bullet for Facebook users, with workers watching harmful content showing killing, suicide and sexual assault and making sure it is taken down.

For Nkunzimana and others, the job began with a sense of pride, feeling like they were heroes to the community, he said.

But as the exposure to alarming content reignited past traumas for some like him who had fled political or ethnic violence back home, the moderators found little support and a culture of secrecy.

Terrible work

After his shift, Nkunzimana would go home exhausted and often locked himself in his bedroom to try to forget what he had seen.

The salary for content moderators was USD 429 per month, with non-Kenyans getting a small expat allowance on top of that.

The Facebook contractor, US-based Sama, did little to ensure post-traumatic professional counselling was offered to moderators in its Nairobi office, Nkunzimana said.


Facebook parent Meta has said its contractors are contractually obligated to pay their employees above the industry standard in the markets where they operate and to provide on-site support from trained practitioners.

In an email to the AP, Sama said the salaries it offered in Kenya were four times the local minimum wage and that over 60 per cent of male employees and over 70 per cent of female employees were living below the international poverty line (less than USD 1.90 a day) before being hired.

Why Facebook?

Such work has the potential to be incredibly psychologically damaging, but job-seekers in lower-income countries might take the risk in exchange for an office job in the tech industry, said Sarah Roberts, an expert in content moderation at the University of California, Los Angeles.

Roberts, an associate professor of information studies, said the difference in the Kenya court case is that the moderators are organising and pushing back against their conditions, creating unusual visibility.

Facebook invested in moderation hubs worldwide after being accused of allowing hate speech to circulate in countries like Ethiopia and Myanmar, where conflicts were killing thousands and harmful content was posted in a variety of local languages.

(With agency inputs)
