Facebook moderators in Kenya fall ill after viewing shocking content
Kyiv • UNN
More than 140 Facebook moderators in Kenya have been diagnosed with severe PTSD after viewing violent content. The workers have filed a lawsuit against Meta and Samasource Kenya over 8-10 hour shifts spent viewing violence and cruelty.
More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder caused by graphic social media content they were required to view, including murder, suicide, child sexual abuse, and terrorism, UNN reports, citing The Guardian.
The moderators worked 8-10 hours a day at a facility in Kenya for a company that was contracted by Facebook.
The head of the mental health service at Kenyatta National Hospital in Nairobi noted that more than 100 people suffer from PTSD, generalized anxiety disorder (GAD) and major depressive disorder (MDD).
The diagnoses were made as part of a lawsuit against Facebook's parent company, Meta, and Samasource Kenya, an outsourcing firm that moderated content for Meta using workers from across Africa.
The images and videos, including necrophilia, bestiality and self-harm, caused some moderators to faint, vomit, scream and run away from their desks, the case file states.
This case sheds light on the human cost of the boom in social media use in recent years, which has required an increasing number of moderators, often in the poorest parts of the world, to protect users from the worst of the material some people post.
Recall
The Australian Parliament has passed a landmark law banning social media for children under 16. The law provides for fines of up to $32.1 million for platforms such as Facebook, Instagram, and TikTok.