TikTok Allegedly Directs Children's Accounts to Pornographic Content in Just a Few Taps
According to a new study, the widely used social media app can steer children's accounts to pornographic content within a small number of clicks.
How the Study Was Conducted
Global Witness created fake accounts using a minor's date of birth and turned on the platform's content restriction setting, which is meant to limit exposure to adult-oriented content.
The researchers found that TikTok suggested sexualized and adult-themed search terms to these accounts, even though they were created on new devices with no prior browsing history.
Troubling Search Prompts
Search terms proposed under the "recommended for you" feature included "very very rude skimpy outfits" and "very rude babes", before escalating to phrases such as "hardcore pawn [sic] clips".
For three of the accounts, these sexualized suggestions appeared immediately.
Fast Track to Adult Material
After a "small number of clicks", the researchers found explicit material including revealing content to explicit intercourse.
The organization said the content attempted to evade moderation, often by embedding the explicit footage within an otherwise innocuous image or video.
In one instance, the process took just two taps after signing in: one on the search bar and a second on the recommended term.
Regulatory Context
Global Witness, whose remit includes examining digital platforms' impact on human rights, said it carried out several rounds of testing: the first before child safety measures under the UK's Online Safety Act came into force on July 25, and another after the measures took effect.
Alarming Results
Researchers noted that multiple clips appeared to show someone below the age of consent; these were reported to the online safety group that oversees the removal of online child sexual abuse material.
The organization claimed that TikTok was in breach of the Online Safety Act, which requires digital platforms to prevent children from viewing harmful material such as pornography.
Regulatory Response
A spokesperson for Britain's media watchdog, which is tasked with enforcing the act, said: "We acknowledge the work behind this research and will analyze its findings."
The regulator's guidance on complying with the act states that digital platforms carrying a medium or high risk of showing harmful material must "configure their algorithms to remove inappropriate videos from young users' timelines".
The platform's rules forbid pornographic content.
Platform Response
The social media company said that, after being alerted by the organization, it removed the offending material and updated its recommendation system.
"As soon as we were made aware of these allegations, we took immediate action to examine the issue, remove content that violated our policies, and introduce upgrades to our search suggestion feature," stated a company representative.