An investigation has found that TikTok’s algorithm recommends sexual content to newly created children’s accounts.
TikTok’s algorithm recommends pornography and sexualized content to children’s accounts, says a new Global Witness report.
The organization’s researchers created fake profiles of 13-year-olds and activated the safety settings, but still received search suggestions containing sexually explicit terms. These suggestions led to sex videos, including images of penetration.
TikTok said it is committed to offering safe and age-appropriate experiences for minors. The company also said it took immediate action after being informed of the problem by Global Witness.
The four fake accounts were created between July and August of this year. The researchers used false birth dates, claiming to be 13 years old, and were not asked to provide any other information to confirm their identities.
Pornography
Global Witness researchers also activated the platform’s “restricted mode”, which, according to TikTok, prevents users from seeing “mature or complex themes, such as … sexually suggestive content”.
Even without searching, the researchers found openly sexual terms suggested in the “You may like” section. The suggestions led to videos of women simulating masturbation, flashing their underwear in public places or exposing their breasts.
In the most extreme cases, the content included explicit pornographic films of penetrative sex. These videos were embedded in apparently innocuous material, in a strategy that successfully evaded the platform’s moderation.
Ava Lee of Global Witness told the BBC that the findings were a “huge shock” to the researchers.
According to Lee, “TikTok isn’t just failing to prevent children from accessing inappropriate content, it is suggesting it to them as soon as they create an account”.
Global Witness is an activist group that investigates how major technology companies influence debates on human rights, democracy and climate change.
The organization’s researchers first identified the problem in April, while conducting other research.
Videos removed
Global Witness informed TikTok, which said it had taken immediate action to resolve the problem.
But in late July and August of this year, the group repeated the research and found that the app was still recommending sexual content.
TikTok said it has more than 50 features designed to protect teenagers: “We are fully committed to providing safe and age-appropriate experiences”. The app also said it removes nine out of ten videos that violate its guidelines before they are ever viewed.
After being notified by Global Witness, TikTok said it had taken steps to “remove content that violated our policies and launch improvements to our search suggestion feature”.
In the United Kingdom, children aged 8 to 17 spend two to five hours a day online, according to an Ofcom survey. The study shows that almost all children over 12 have a mobile phone and watch videos on platforms such as YouTube and TikTok.
In 2023, TikTok introduced a default 60-minute daily screen time limit for users under 18, but the limit can be turned off in the settings.
In March 2025, a BBC report showed that TikTok profits from live broadcasts of sexual content by teenagers. The app keeps about 70% of the amounts paid.
TikTok learned of the exploitation of children and teenagers in live broadcasts and conducted internal investigations in 2022, according to filings in a lawsuit brought by the US state of Utah. The complaint alleges that the company ignored the problem because it profited considerably from the exploitation.
TikTok said the lawsuit under way in the United States ignores the “proactive measures” it has adopted to improve the safety of the platform.
Legislation to protect minors
On July 25 of this year, the Children’s Codes of the Online Safety Act came into force in the United Kingdom, imposing a legal duty to protect children on the internet. The measure requires some services, in particular pornographic sites, to verify the age of their users in the United Kingdom.
The law aims to make the internet safer, especially for minors, and is implemented and overseen by Ofcom, the country’s media regulator.
Companies must adopt “highly effective” age checks to prevent minors from accessing pornography. They must also adjust their algorithms to block content that encourages self-harm, suicide or eating disorders.
Failure to comply can lead to a fine of up to £18 million (R$118 million) or 10% of a company’s global revenue, whichever is higher. Executives can also be arrested.
In serious cases, Ofcom may seek a court order to remove websites or apps from the United Kingdom.
The change in the law has generated a major debate in the United Kingdom.
Some experts and activists support stricter rules, and even a ban on under-16s using social networks.
Ian Russell, president of the Molly Rose Foundation, which was set up after the death of his 14-year-old daughter, said he was “dismayed by the lack of ambition” of Ofcom’s codes.
The UK’s leading children’s charity, the National Society for the Prevention of Cruelty to Children (NSPCC), has criticized the fact that the current legislation does not guarantee protection on private messaging apps such as WhatsApp.
According to the organization, end-to-end encryption “continues to pose an unacceptable risk to children”. In this type of encryption, messages are encoded when they leave the sender’s phone and can only be decoded on the recipient’s phone.
Privacy advocates, on the other hand, argue that the age verification methods adopted by the United Kingdom are invasive and ineffective.
Silkie Carlo, director of Big Brother Watch, a British NGO that campaigns on privacy and civil rights, said these rules could lead to “security breaches, privacy invasion, digital exclusion and censorship”.
TikTok and regulation in Brazil
TikTok is one of the most popular social networks in Brazil, present on 46% of mobile phones, behind Instagram (91%) and Facebook (76%), according to a Mobile Time/Opinion Box survey. Unofficial estimates point to about 100 million users in the country, which has 213 million inhabitants.
In June, the Federal Supreme Court (STF) expanded the regulation of digital platforms, ruling that companies can be held responsible for criminal content published by third parties. Serious content, such as anti-democratic messages, child pornography and incitement to suicide, must be actively removed, while other content needs to be removed only after notification.
On 18/18, President Luiz Inacio Lula da Silva signed into law the Digital Statute of Children and Adolescents (ECA Digital), which establishes the responsibility of technology companies to protect users under 18 from harmful content.
“It is a mistake to believe that big tech will regulate itself. This misunderstanding has already cost the lives of several children and teenagers. Different countries have moved forward in creating legal mechanisms to protect children and adolescents in the digital environment,” said the president.
Regulation of this sector will be the responsibility of the National Data Protection Authority (ANPD).
Source: Terra
