
TikTok sued by content moderator who says she developed PTSD from reviewing disturbing content

Candie Frazier, a Las Vegas-based contractor for TikTok parent company ByteDance, alleges that she and other content moderators often spend 12 hours a day reviewing disturbing content. She claims TikTok and ByteDance fail to provide adequate protections and psychological support to content moderators, according to the complaint.

“Plaintiff Frazier has viewed videos of the Myanmar genocide, mass shootings, child rape, and animal mutilation,” the complaint states. “As a result of constant, unmitigated exposure to highly toxic and extremely disturbing images in the workplace, Ms. Frazier has developed and suffers from significant psychological trauma, including anxiety, depression, and post-traumatic stress disorder.”

The proposed class action lawsuit, filed last week in federal court in California, is likely to intensify scrutiny of problematic content and moderation practices at TikTok. The short-form video platform had previously escaped much of the scrutiny directed at bigger rivals such as Facebook and YouTube, but has drawn attention from critics and lawmakers in recent months after exploding in popularity, especially among young people, during the pandemic. The company said in September that it had reached 1 billion monthly users.

A spokesperson for TikTok said the company does not comment on pending litigation.

“We strive to promote a caring working environment for our employees and contractors,” the spokesperson said. “Our safety team partners with third-party companies on the essential work of helping protect the TikTok platform and community, and we continue to develop a range of wellness services so that moderators feel supported mentally and emotionally.”

Frazier is not an employee of TikTok or ByteDance; rather, she works for a Canadian company called Telus International, which supplies outsourced content moderation workers to TikTok and other social media platforms. But Frazier alleges in the lawsuit that her work is dictated and supervised by TikTok and ByteDance. A spokesperson for Telus, which is not named as a party to the lawsuit, said Frazier never raised concerns about her work and that “her claims are grossly inconsistent with our policies and practices.”

“We have a strong resilience and mental health program in place to support all of our team members, as well as a comprehensive benefits package for access to personal health and wellness services,” the Telus spokesperson said. “Our team members can raise questions and concerns about any aspect of their work through multiple internal channels, all of which the company takes very seriously.”

Facebook (FB) faced a similar lawsuit in 2018 from a content moderator who reported developing PTSD after being exposed to content featuring rape, suicide, and violence in the workplace. Among the criticisms Facebook faced over its content moderation practices was that moderation contractors did not receive the same benefits as corporate employees, despite being tasked with such a trying job. The social media giant ultimately agreed to a $52 million class action settlement, which included payments and funding for mental health treatment for content moderators, as well as workplace changes.

A TikTok executive testified on Capitol Hill for the first time in October and acknowledged the need to increase protections for young users on the platform. “We seek to earn trust through a higher level of action, transparency and accountability, as well as humility, to learn and improve,” TikTok vice president and head of public policy Michael Beckerman told a Senate subcommittee. But Frazier’s lawsuit may point to the challenges of improving those protections.

The complaint alleges that problematic content is reviewed by moderators only after it has been uploaded to the platform and a user reports it. Because of the sheer volume of content entrusted to them, moderators have only 25 seconds to review each video and watch “three to ten videos at the same time,” it says. (TikTok did not immediately respond to a request for comment on these allegations.)

“These videos include cruelty to animals, torture, suicides, child abuse, murders, beheadings and other graphic content,” according to the complaint. “The videos are each sent to two content moderators, who review the videos and determine whether the video should remain on the platform, be removed from the platform, or have its audio muted.”

Theo Bertram, then TikTok’s director of public policy for Europe, the Middle East and Africa, told UK lawmakers in September 2020 that the company had 10,000 people working on its “trust and safety” team around the world. Earlier this year, TikTok also launched an automated moderation system to scan for and remove videos that violate its policies upon upload, though the feature is only available for certain categories of content.

The system handles “the categories of content where our technology has the highest degree of accuracy, starting with violations of our policies on child safety, adult nudity and sexual activity, violent and graphic content, and illegal activities and regulated goods,” Eric Han, TikTok’s head of US safety, wrote in a July blog post. “We hope this update will also support resiliency within our safety team by reducing the volume of distressing videos viewed by moderators and enabling them to spend more time in highly contextual and nuanced areas.”
TikTok says 93% of violating videos removed between April and June 2021 were taken down within 24 hours of being posted, the majority of which had zero views and were flagged by its automated system rather than reported by a user, according to a Community Guidelines Enforcement Report released in October. (TikTok has not commented on Frazier’s claim that content moderators only review videos after they have been flagged by a user.)

Frazier also alleges that content moderators are required to sign non-disclosure agreements that “exacerbate the harm” caused by the job, according to the complaint. The practice of requiring workers to sign NDAs has come under fire in the tech industry recently amid employee disputes at Pinterest, Apple and other big tech companies. TikTok did not immediately respond to a request for comment on its NDA practices.

With the lawsuit, Frazier is seeking to have TikTok pay damages (in an amount to be determined later) to herself and other content moderators, and to establish a “medical monitoring fund” to pay for the costs of screening, diagnosing, and treating such workers’ psychological disorders, according to the complaint.