
TikTok is under pressure again. Research published by Maldita.es denounces that the platform allows the dissemination of sexualized videos of minors, both real and created with artificial intelligence, which work as a lure to draw users toward child pornography hosted or sold on external websites and Telegram channels.
The study concludes that TikTok not only tolerates these publications but sometimes recommends them to users, even though they violate the Digital Services Act (DSA) and the platform’s own rules.
Maldita.es analyzed 40 accounts with more than 1.5 million subscribers, hosting 5,200 videos of minors created with AI and 3,600 videos of real minors. In the AI-generated clips, girls and teenagers appear in tight clothing, bikinis or school uniforms, with shots focused on breasts and legs. In the real videos, the content comes directly from the minors’ accounts, sometimes even with the watermark still visible, allowing access to their authentic profiles.
In the comments, another pattern emerges: sexualized messages, requests for contact and explicit emojis.
The investigation also reveals that many of these accounts use TikTok as a showcase to direct users to Telegram pages and channels where child pornography is sold or exchanged. The comments feature terms such as “buy”, “trade”, “trade and sell” or “tlg”, common in these illicit circuits.
To verify this, Maldita.es contacted 14 accounts that promoted their Telegram channels. Eleven sent an automated menu with a catalog of content for sale, and seven sent child pornography without even being asked. The outlet reported these cases to the Spanish National Police.
The TikTok algorithm, part of the problem
The investigation also examined the platform’s recommendation system. Using a test account, the team observed that TikTok began suggesting more sexualized videos of minors after the account had watched just a few clips. The recommendations appeared both in the For You feed and in the search engine.
Among the suggestions were terms such as “innocent schoolgirls 13”, “dancing schoolgirls”, “little women” or “cute and pretty girls”. The platform even recommended subscribing to accounts that distribute this type of content.
TikTok does not remove reported accounts, according to Maldita.es
The outlet reported 15 accounts publishing this type of video, as well as 60 individual clips. Three days later:
- None of the accounts had been deleted, except one that received partial restrictions.
- TikTok had removed only 7 of the 60 reported videos.
- In 11 cases, recommendation in For You had been disabled, but the videos remained accessible.
All this despite the fact that TikTok’s Community Guidelines explicitly prohibit the sexualization of minors, including when the content is AI-generated.
Furthermore, the Digital Services Act (DSA) requires platforms like TikTok to assess the risks of their algorithms and remove illegal content once it is detected. Marcos Judel, a lawyer specializing in data protection and digital law, warns on Maldita.es that the social network could face significant fines for failing to prevent the dissemination of illegal content, regardless of where it was generated.
Affected minors can also report a violation of their right to their own image and turn to the Spanish Data Protection Agency (AEPD) over the misuse of their personal data.
TikTok defends itself: “Zero tolerance”
Consulted by 20bits, the company rejects the accusations and insists that it acts forcefully against this type of content.
“We have zero tolerance for child sexual abuse material, including AI-generated images created on other platforms. We are working intensively to detect and remove this aberrant content, as well as to terminate accounts and report cases to NCMEC. 99.8% of the content we remove is taken down before it is reported,” TikTok sources say.