Candie Frazier, a Las Vegas-based contractor for TikTok's parent company ByteDance, alleges that she and other content moderators often spend 12 hours a day reviewing disturbing content. She claims TikTok and ByteDance fail to provide adequate protections and psychological support to content moderators, according to the complaint.
“Plaintiff Frazier viewed videos of the Myanmar genocide, mass shootings, child rape, and animal mutilation,” the complaint states. “Due to constant and unmitigated exposure to highly toxic and extremely disturbing images in the workplace, Ms. Frazier has developed and suffers from significant psychological trauma, including anxiety, depression, and posttraumatic stress disorder.”
A spokesperson for TikTok said the company does not comment on pending litigation.
“We strive to promote a caring working environment for our employees and contractors,” the spokesperson said. “Our safety team partners with third-party firms on the critical work of helping to protect the TikTok platform and community, and we continue to develop a range of wellness services so that moderators feel supported mentally and emotionally.”
Frazier is not an employee of TikTok or ByteDance; she works for a Canadian firm called Telus International, which supplies content moderation workers to TikTok and other social media platforms. But Frazier alleges in the lawsuit that her work is dictated and supervised by TikTok and ByteDance. A spokesperson for Telus, which is not named as a party in the lawsuit, said Frazier never raised concerns about her work and that “her claims are grossly inconsistent with our policies and practices.”
“We have a strong resilience and mental health program in place to support all of our team members, as well as a comprehensive benefits package for access to personal health and wellness services,” the Telus spokesperson said. “Our team members can raise questions and concerns about any aspect of their work through multiple internal channels, all of which the company takes very seriously.”
The complaint alleges that moderators only review problematic content after it has been uploaded to the platform and a user has reported it. Because of the sheer volume of content assigned to them, moderators have only 25 seconds to review each video and watch “three to ten videos at the same time,” it alleges. (TikTok did not immediately respond to a request for comment on these allegations.)
“These videos include cruelty to animals, torture, suicides, child abuse, murders, beheadings and other graphic content,” according to the complaint. “The videos are each sent to two content moderators, who review the videos and determine whether the video should remain on the platform, be removed from the platform, or have its audio muted.”
With the lawsuit, Frazier is seeking to have TikTok pay damages (in an amount to be determined later) to herself and other content moderators, and to establish a “medical monitoring fund” to cover the cost of screening, diagnosing, and treating such workers' psychological conditions, according to the complaint.