‘I didn’t eat or sleep’: a Meta moderator on his breakdown after seeing beheadings and child abuse

TruthLens AI Suggested Headline:

"Meta Content Moderators Share Struggles with Mental Health After Exposure to Graphic Content"

AI Analysis Average Score: 7.0
These scores (0-10 scale) are generated by TruthLens AI's analysis, assessing the article's objectivity, accuracy, and transparency. Higher scores indicate better alignment with journalistic standards.

TruthLens AI Summary

Solomon, a newly hired Meta content moderator, began his role with the intention of contributing positively to social media by eliminating harmful content. However, he soon found himself overwhelmed by the disturbing nature of the material he was required to review. Within weeks, he encountered graphic videos depicting extreme violence, including beheadings and child abuse, which deeply affected his mental health. Despite the initial shock, he reported a troubling normalization of such content, leading him to question his own humanity. The emotional toll escalated when he learned of a childhood friend's murder, triggering a mental health crisis that resulted in destructive behavior and ultimately a suicide attempt. After being hospitalized for major depression, Solomon felt abandoned by his employer, Teleperformance, which offered him a lower-paid position instead of adequate support and compensation for his trauma.

Another moderator, Abel, shared similar experiences, highlighting the emotional strain of witnessing graphic content daily and the lack of support provided by their employer. He expressed concern for Solomon and criticized Teleperformance's handling of mental health issues among staff. Both moderators felt that the nature of the job had fundamentally altered their personalities, leading to numbness and a fear of discussing their struggles openly due to potential repercussions. Teleperformance responded to concerns by stating they offered psychological support and alternative roles, but both Solomon and Abel felt their issues were not adequately addressed. The case raises significant questions about the mental health implications for content moderators dealing with extreme content and the responsibilities of companies in supporting their employees' wellbeing.

TruthLens AI Analysis

The article reveals the harrowing experiences of a content moderator at Meta, shedding light on the severe mental health challenges faced by individuals working in this role. It explores the psychological toll of viewing graphic and disturbing content regularly, raising questions about the ethical implications of such jobs and the support provided to employees.

Mental Health Struggles

The narrative provides a personal account of Solomon's descent into depression following exposure to extreme violence and abuse. It illustrates how the normalization of horrific content can desensitize individuals and lead to a breakdown in mental health. This aspect is critical as it highlights the often-overlooked psychological burden borne by moderators, which may not be adequately addressed by employers.

Workplace Environment

The article hints at the inadequate working conditions provided to Solomon and his colleagues, such as shared living spaces and insufficient privacy. This contributes to a toxic work environment that may exacerbate the mental health issues faced by content moderators. The lack of privacy and support raises concerns about the responsibilities of companies like Meta and their outsourcing partners in safeguarding employee well-being.

Societal Implications

By sharing Solomon's story, the article aims to raise awareness about the broader implications of content moderation on societal health. It encourages readers to consider the mental health impact on those who keep social media platforms safe, fostering a greater understanding of the hidden costs of such employment. This could lead to calls for better support systems and changes in industry practices.

Manipulative Elements

There are aspects of manipulation in how the story is presented, particularly in the emotional weight of Solomon's experiences. The vivid descriptions of graphic content and the psychological impact may be intended to evoke a strong emotional reaction from the audience, potentially leading to a call for reforms in content moderation practices. This emotional appeal could overshadow rational discussions about the complexities of internet safety and moderation.

Connection to Broader Issues

The article draws connections to wider conversations about the responsibilities of tech companies in managing harmful content online. It raises awareness about the dark realities of social media and the human cost of maintaining these platforms. This aligns with ongoing public discourse regarding the regulation of social media and the ethical responsibilities of tech giants.

Potential Economic Impact

The revelations could provoke discussions about the financial implications for Meta and similar companies. If public sentiment shifts toward demanding better treatment of content moderators and improved mental health support, it could lead to increased operational costs for these companies. Moreover, potential boycotts or negative publicity might affect stock prices, particularly for companies heavily reliant on content moderation.

Target Audience

This article appears to resonate most with communities advocating for mental health awareness, labor rights, and ethical business practices. It targets readers who are concerned about the human aspects of technology and the implications of social media on society.

Reliability Assessment

The reliability of this article hinges on its sourcing and the authenticity of Solomon's account. If verified, it serves as a crucial testimony to the experiences of content moderators. However, the emotional framing may dilute the objective analysis of the broader systemic issues at play.

The combination of personal narrative and systemic critique makes this article a significant piece in understanding the complexities of content moderation and its effects on mental health.

Unanalyzed Article Content

When Solomon* strode into the gleaming Octagon tower in Accra, Ghana, for his first day as a Meta content moderator, he was bracing himself for difficult but fulfilling work, purging social media of harmful content.

But after just two weeks of training, the scale and depravity of what he was exposed to were far darker than he had ever imagined.

“The first day I didn’t come across any graphic content, but gradually I started coming across very graphic content like beheadings, child abuse, bestiality. When I first came across that ticket I was very shocked. I didn’t even look at my computer because it was very disturbing for me.

“But gradually I started normalising what happened because I became used to it. I even started to enjoy seeing people beheaded, child abuse, pornography, suicide. I asked myself, is that normal? And I replied, it’s not.”

Solomon, who arrived from his home in east Africa in late 2023, said he would “never forget the day” he came across a person being gradually skinned alive. “The system doesn’t allow us to skip it … we have to look at it for 15 seconds, at least.”

Another video featured a woman from his home country screaming for help in his native language while several people stabbed her.

He says the videos became more and more disturbing. Some days there would be no such videos; then something would start trending and, on the same day, 70-80% of videos would feature graphic content. He felt himself gradually becoming "out of humanity".

In the evenings, he would return to the shared flat provided by his employer, the outsourcing company Teleperformance, with "no room for privacy and many problems with water and electricity".

When Solomon learned that a childhood friend had been killed, his already fragile mental health unravelled. He broke a window and a mirror in frustration, leading Teleperformance to suspend him until he felt better.

He spent the next two weeks home alone. “I started developing depression. I didn’t eat or sleep, I drank day and night and smoked day and night. I was not this kind of person before,” he said.

Solomon attempted suicide, and was admitted to a psychiatric hospital where, he said, he was diagnosed with major depressive disorder with suicidal ideation. He was discharged after eight days, at the end of 2024.

Teleperformance offered to transfer him to a lower-paid job, but he feared he would not earn enough to survive in Accra. He asked for compensation for harm and longer-term psychological care to be covered, but instead Teleperformance sent him back to his home town, which is in the midst of an insurgency.

“You’re using me and throwing me. They treated me like a water bottle – you drink water and throw the bottle away,” Solomon said after his dismissal.

He said he had held a professional job in his home country, adding: “Before coming here I was so happy and peaceful.”

Another moderator, Abel*, shared how he, too, had had his contract terminated, for standing up for his friend Solomon and for the rights of other employees.

He said he had told Teleperformance: “You’re not treating him well.”

“They just put him in the house. He stayed alone and he was telling me he’s scared being alone all the time, it was giving him severe stress, so he started going to the company, ‘I want to stay with you guys, I want to be in office, I’m scared.’”

Abel, too, had struggled with his mental health because of the content. “I didn’t know the nature of the job, actually. I didn’t realise I’d see people skinned alive and porn videos as my daily job … This is the first time I’d heard of content moderators … I used to be spooked when I saw blood, but now I’ve become numb. Gradually I’ve seen it altering my character … I’m struggling. Not to exaggerate, it’s 100% changed me.”

He said his colleagues often sat around drinking coffee and discussing disturbing content, including religious colleagues sharing their feelings of shame.

He has come to fear raising such issues with the wellbeing coach, as he has seen his disclosures later referred to by his team leader. When he said he no longer wished to use the wellbeing services, which he believed were "for research on us", he was challenged.

A Teleperformance spokesperson said: “Upon learning of his depression as a result of his friend’s death, we conducted a psychological assessment and determined that the employee was no longer well enough to continue providing moderation services.

“Instead, we offered him a different non-moderation role which he declined, insisting that he wanted to continue in this moderation role. Given this was not an option, his position ended and he was provided compensation according to his contractual agreement.

“During his employment and afterward, we continued offering the employee psychological support, which he has repeatedly declined. At the suggestion of his brother, so that family could help provide the employee with support, and with the approval of medical counsel, we provided the employee with his flight back to Ethiopia.

“We have continued to offer the employee psychological support in Ethiopia, however he has declined the support. Instead, he has tried to extort Teleperformance for money under threat of going to the media.”

*Names have been changed to protect identities

In the UK and Ireland, Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, you can call or text the National Suicide Prevention Lifeline on 988, chat on 988lifeline.org, or text HOME to 741741 to connect with a crisis counsellor. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org

Source: The Guardian