TikTok moderators say they were shown footage of child sexual abuse during training
A Forbes report alleges that TikTok's moderation team gave broad, insecure access to illegal photos and videos, raising questions about how it handles child sexual abuse material.
Employees of Teleperformance, a third-party moderation firm that works with TikTok among other companies, claim the company asked them to review a disturbing spreadsheet called DRR, or Daily Required Reading, as part of TikTok moderation training. The spreadsheet allegedly contained “hundreds of images” of children who were nude or being abused, content that violates TikTok’s own rules. The workers say hundreds of people at TikTok and Teleperformance could access the material from inside and outside the office, opening the door to far wider exposure.
TikTok says its training materials have “strict access controls and do not include visual examples of CSAM,” though it did not confirm that all third-party vendors meet that standard, and Teleperformance denied to Forbes that it showed employees sexually exploitative content. “Content of this nature is abhorrent and has no place on or off our platform, and we aim to minimize moderators’ exposure in line with industry best practices. TikTok’s training materials have strict access controls and do not include visual examples of CSAM, and our specialized child safety team investigates and makes reports to NCMEC,” TikTok spokesperson Jamie Favazza said in a statement.
The employees tell a different story, and as Forbes lays out, it’s a legally fraught one. Content moderators routinely have to deal with CSAM posted across many social media platforms, but depictions of child abuse are illegal in the US and must be handled carefully. Companies are supposed to report the material to the National Center for Missing and Exploited Children (NCMEC), then preserve it for 90 days while minimizing the number of people who see it.
The allegations here go far beyond that. They suggest that Teleperformance showed employees graphic images and videos as examples of what to tag on TikTok, all while playing fast and loose with access to that content. One employee says she contacted the FBI to ask whether the practice amounted to criminally distributing CSAM, though it’s unclear whether the agency opened a case.
The full Forbes report is well worth reading. It describes a situation where moderators couldn’t keep up with TikTok’s explosive growth and were told to watch footage of crimes against children for reasons they felt didn’t add up. It’s a strange situation, and a horrifying one if the claims are accurate, even by the complicated standards of debates over children’s safety online.