Former Moderator Sues Chaturbate for ‘Psychological Trauma’

This article was produced in collaboration with Court Watch, an independent outlet that unearths overlooked court records. Subscribe to them here.

A former content moderator for Chaturbate is suing the live-streaming porn platform for psychological trauma he claims he suffered after being exposed to “extreme, violent, graphic, and sexually explicit content” every day without industry-standard safeguards, according to a new lawsuit.

Neal Barber, who was hired by Bayside Support Services and Multi Media LLC—the parent company of Chaturbate—in 2020, filed a lawsuit on July 22 claiming that those companies “knowingly and intentionally failed to provide their content moderators with industry-standard mental health protections, such as content filters, wellness breaks, trauma-informed counseling, or peer support systems.” The lawsuit is a proposed class action on behalf of moderators hired in the last four years to moderate Chaturbate streams.

Do you know anything else about moderation at social media and adult websites? I would love to hear from you. Using a non-work device, you can message me securely on Signal at sam.404. Otherwise, send me an email at sam@404media.co.

“The company has not been served nor has it reviewed the complaint and therefore cannot comment on the matter at this time,” a spokesperson for Multi Media LLC told 404 Media. “With that said, it takes content moderation very seriously, deeply values the work of its moderators, and remains committed to supporting the team responsible for this critical work.” 

Chaturbate hosts live cam shows by adult performers, and most shows also feature a live chat where users can tip performers, talk to each other, and request specific sex acts or pay to take a model “private” for short exclusive camming sessions.

“Because platforms like Chaturbate host vast amounts of live, unfiltered, and sexually explicit content, content moderators are essential to maintain compliance with legal standards, enforce platform rules, and prevent the dissemination of illegal or abusive material,” the lawsuit says. “They serve as the first line of defense against child exploitation, non-consensual content, violent content, obscene content, self-harm, and other violations. Without content moderators, Chaterbate.com would quickly become unmanageable, unsafe, and legally vulnerable.” 

Barber claims Chaturbate doesn’t adequately protect moderators, whom the lawsuit says the platform calls “Customer Service Risk Supervisors.” The lawsuit alleges that Chaturbate doesn’t use many of the industry-standard practices for protecting moderators against psychological harm, such as grayscaling content, muting auto-playing videos, mandating wellness breaks, and offering trauma-informed supervision or psychological support.

“Without these safeguards, Mr. Barber eventually developed full-blown PTSD, which he is currently still being treated for,” Chris Hamner, an attorney representing the plaintiff in this case, told 404 Media. “The class action we have filed seeks redress for Mr. Barber and other content moderators like him who are battling the effects of this harmful content moderation work on the Chaturbate platform. It’s a negligence argument based on breach of duty of care.”

The lawsuit alleges that moderators like Barber “continue to be routinely exposed to some of the most graphic, disturbing, obscene and psychologically damaging content found anywhere online. Their jobs require them to monitor live-streamed material which too often involves child sexual abuse imagery, self-harm and suicide threats, extreme violence, and highly obscene, degrading, or dehumanizing sexual acts. Much of this content is created to be intentionally shocking, often non-consensual, and designed to provoke trauma.”

Exposure to this content, the lawsuit claims, has resulted in “vivid nightmares, emotional detachment, panic attacks, and other symptoms consistent with PTSD for Chaterbate.com content moderators.”

The lawsuit is the latest in recent years in which moderators have sued platforms over an alleged lack of protections. In 2019, media reports, including an exposé by The Verge about Facebook, drew attention to the psychological trauma platform moderators can endure when working with user-generated content. In 2022, a former moderator for Pornhub’s parent company told The Verge about their time at the company a decade earlier, which involved, in part, moderating assault videos. Also in 2022, YouTube paid $4.3 million to moderators to settle a lawsuit alleging that they didn’t receive proper protections from the platform while viewing disturbing content. And a 2024 report by The Intercept found that Brazilian moderators were paid pennies to moderate extreme content on X, with no psychological support. This is the first time Chaturbate has faced a lawsuit over its alleged moderation practices.

Last year, the state of Texas sued Chaturbate and several other porn sites, alleging that the sites were not complying with Texas’ age verification law; Chaturbate paid $675,000 as part of a settlement. In May, a woman in Kansas sued Chaturbate, claiming the site was at fault after her teenage son found her old laptop unlocked in a closet and used it to access porn without age verification in place.
