TikTok under US government investigation over child sex abuse material

TikTok is under investigation by US government agencies over its handling of child sex abuse material, as the booming short-form video app struggles to moderate a flood of new content.

Dealing with sexual predators has been a persistent challenge for social media platforms, but TikTok’s young user base has made it a particular target.

The US Department of Homeland Security is investigating how TikTok handles child sexual abuse material, according to two sources familiar with the matter.

The Justice Department is also looking into how a specific privacy feature on TikTok is being exploited by predators, a person with knowledge of the matter said. The DOJ has a long-standing policy of not confirming or denying the existence of ongoing investigations.

“It’s a perfect place for predators to meet, groom and engage children,” said Erin Burke, chief of the Child Exploitation Investigations Unit at Homeland Security’s cyber crimes division, calling it the “platform of choice” for the behavior.

The investigations highlight how TikTok is struggling to keep up with the torrent of content generated by more than 1 billion users. The company, owned by China’s ByteDance, has more than 10,000 human moderators worldwide and has been rapidly hiring in this area.

Business is booming. An Insider Intelligence forecast puts TikTok’s ad revenue at $11.6 billion this year, triple last year’s $3.9 billion.

Mark Zuckerberg, chief executive of Meta, has cited TikTok’s popularity among young people as the main reason for slowing interest in his company’s longer-established platforms, Facebook and Instagram.

But Meta has more experience dealing with problematic material, with about 15,000 moderators worldwide alongside automated systems designed to flag posts.

Between 2019 and 2021, the number of TikTok-related child exploitation investigations by Homeland Security increased sevenfold.

Social media networks use detection technology trained on a database of images collected by the National Center for Missing and Exploited Children (NCMEC), the centralized body to which companies are legally required to report child sexual abuse material.

TikTok reported almost 155,000 videos last year, while Instagram, which also has more than 1 billion users, made almost 3.4 million reports. TikTok received no takedown requests from NCMEC last year, unlike rivals Facebook, Instagram and YouTube.

“TikTok has zero tolerance for child sexual abuse material,” the company said. “Where we become aware of an attempt to publish, obtain or distribute [child sexual abuse material], we remove content, ban accounts and devices, immediately report to NCMEC and contact law enforcement if necessary.”

However, Homeland Security’s Burke claimed that international companies such as TikTok were less motivated when it came to working with US law enforcement. “We want [social media companies] to proactively ensure that children are not exploited and abused on your sites, and I can’t say they are doing that, and I can say that a lot of American companies are,” she added.

TikTok said it removed 96% of content that violated its underage safety policies before anyone viewed it. Videos of underage drinking and smoking accounted for the majority of removals under these guidelines.

One pattern the Financial Times verified with law enforcement and child safety groups involved content being procured and traded through private accounts, with passwords shared among victims and other predators. Coded keywords appear in public videos, usernames and bios, but the illegal content is uploaded using the app’s “Only Me” feature, which makes videos visible only to those logged into the profile.

Seara Adair, a child safety campaigner, reported the trend to US law enforcement after first flagging the content to TikTok and being told that one video did not violate its policies. “TikTok constantly talks about the success of their artificial intelligence, but a clearly naked child is slipping through it,” Adair said. All of the accounts and videos the FT flagged to TikTok have since been removed.

“We care deeply about the safety and well-being of minors, which is why we incorporate youth safety into our policies, enable default privacy and security settings on teen accounts, and limit features by age,” TikTok added.
