Mr Wright, from online safety and security organisation SWGfl, said the experiment echoed the concerns his charity had raised, that “children can be exposed to harmful content, even when no risky search terms are used”.

“It is worrying. The content you describe… presents serious risks to children’s mental health and wellbeing, and we have all too often seen the tragic consequences,” he added.

“Exposure to such material can, in some cases, normalise harmful behaviours, lead to emotional distress, and significantly impair children’s ability to navigate the online world safely.”

Karl Hopwood, a member of the UK Council for Internet Safety, said that while demonstrating how sharp a knife can cut an orange is different from the more violent content, “it’s just easy for this to be taken out of context”.

“Adverts for knives would, as far as I know, breach community guidelines and I personally don’t think we should be showing that sort of stuff to the youngest users.

“The stuff around suicide/self-harm/depression isn’t great if you’re already vulnerable and feeling low – but for a lot of young people it may not be an issue at all,” he added.

Ofcom is now enforcing the UK’s Online Safety Act, and has finalised a series of child safety rules which will come into force for social media, search and gaming apps and websites on 25 July.

Mr Wright said regulation was a “vital step”, but “it must be matched by transparency, enforcement, and meaningful safety-by-design practices”, including algorithms being subject to scrutiny and support for children, parents and educators to identify and respond to potential risks.

“We must move beyond simply reacting to online harms and towards a proactive, systemic approach to child online safety,” he added.

Some of the content reflected a negative state of wellbeing

A TikTok spokesperson said its teen accounts “start with the strongest safety and privacy settings by default”.

They added: “We remove 98% of harmful content before it’s reported to us and limit access to age-inappropriate material.

“Parents can also use our Family Pairing tool to manage screen time, content filters, and more than 15 other settings to help keep their teens safe.”

Meta, which owns Instagram, did not provide a specific comment, but told us it also has teen accounts, which offer built-in restrictions and an “age-appropriate experience” for 13-15-year-olds.

These restrictions automatically come into effect when the user inputs their date of birth while setting up the app.

The company said that while no technology was perfect, the additional safeguards should help ensure teens only see content that is appropriate for them.

Meta is bringing Teen Accounts to Facebook and Messenger later this month and adding more features.

A YouTube spokesperson said: “We take our responsibility to younger viewers very seriously, which is why we recently expanded protections for teens on YouTube, including new safeguards on content recommendations.

“We generally welcome research on our systems, but it’s difficult to draw broad conclusions based on these test accounts, which may not be consistent with the behaviour of real people.”

How the project worked

Each user profile was created on a different sim-free mobile phone, with location services turned off.

We created a Gmail account for each of the users on each device, and then created the profiles – Instagram and TikTok for the girls, YouTube and TikTok for the boys.

The profiles differed from each other and were based on research from the Childwise Playground Buzz report, which gives insight into children's interests, favourite brands and habits.

We did a few basic searches and follows on the first day based on each user’s specific likes, including music, beauty, gaming and sport.

From then on we mostly just scrolled and liked. We did not post, comment on or share any content throughout the experiment.

We scrolled on each platform for each profile for 10 minutes per day for a week.