I was a content moderator for Facebook. I saw the real cost of outsourcing digital labour | Sonia Kgomo
Briefly

Mark Zuckerberg's announcement that Meta will replace its independent fact-checking with community notes has sparked controversy, particularly over its implications for content moderation. Speaking at the AI Action Summit in Paris, former content moderator Sonia Kgomo urged tech companies to support and protect the mental health of the workers who filter social media content. She recounted her own experience of working under extreme conditions at Sama in Nairobi, confronting the darkest content on the internet, which led to serious mental health issues caused by the pressure and unrealistic expectations set by the tech giant.
Meta’s shift from independent fact-checking to community notes could compromise the integrity of information on social media, risking public trust.
The pressures of content moderation at Sama reveal a systemic failure in how tech companies prioritize profit over the mental health of their workers.
The use of metrics such as AHT (average handling time) to manage moderators demonstrates a troubling trend in which speed is prioritized over the mental well-being of employees.
Our conversations as content moderators revealed a larger issue: the exploitation and mental toll borne by those responsible for filtering online content.
Read at www.theguardian.com