Ofcom warns tech firms on kids' online safety

Ofcom has once again warned technology firms that they will be held to account once it has finalised new rules for keeping children safer online, and to that end it is about to consult on what exactly their responsibilities should be.

Mary Lennighan

April 19, 2024


The UK's telecoms watchdog and, more recently, online safety regulator on Friday published a raft of new studies looking at young people's interactions with social media and online content, and at the behaviour of children and adults online. But its real message concerned the upcoming consultation on children's safety, which will ultimately feed into new rules affecting the tech companies.

Ofcom revealed that it will launch a consultation in May on its draft Children's Safety Code of Practice. We knew that was coming, but the regulator had previously committed only to a 'spring' launch date.

The consultation will set out the practical steps Ofcom expects technology companies to take to ensure children have safer experiences online. Ofcom reiterated that this concerns access to content that is legal but potentially harmful to children, such as pornography and content promoting suicide and self-harm.

The regulator also said it plans to launch an additional consultation – there will be many overlapping studies as Ofcom gets to grips with its role as online safety regulator – into how automated detection tools, including AI, can be used to mitigate online risks. In this case it is referring to illegal content, such as previously undetected child sexual abuse material, as well as suicide and self-harm content. That study will feed into its draft illegal harms Codes of Practice, which are due to come into force towards the end of this year, although it is worth noting that Ofcom did not mention that target date this time.

The illegal harms codes will be the first to come into force as a result of the Online Safety Act, which became law in the UK in October 2023. Amongst other things, the act gives Ofcom powers to censure technology companies for failing to meet their responsibilities – as yet undefined – to police harmful content on their sites.

"We are clear they must be ready to meet their new duties once in force and stand ready to hold them to account," Ofcom reiterated on Friday. It did not name any tech companies in particular, but has in the past made reference to major search companies Google, Microsoft Bing, DuckDuckGo, Yahoo! and AOL.

The Online Safety Act is a broad umbrella covering areas of technology that are particularly difficult to regulate. Implementing specific rules and regulations was always going to be an onerous task, hence the myriad consultations from Ofcom. There will be many more to come before the regulator's expectations are fully clear.

In the meantime, Ofcom is filling in the gaps with endless studies in its areas of focus.

Its latest data, from one of four reports published simultaneously, shows that young children are spending more time online and are being supervised less closely, a headline finding that sits comfortably alongside its regulatory agenda.

Some 24% of children aged five to seven now own a smartphone and 76% use a tablet, Ofcom said. It also highlighted an increase in online activity among the same age group. Its 2024 media use and attitudes report shows that 65% go online to send messages or make voice or video calls, up from 59% a year ago, while 50% watch live-streamed content, compared with 39% last year.

Use of social media sites and apps has risen to 38% from 30%; WhatsApp and TikTok are the most popular, with Instagram and Discord also featuring strongly.

Naturally, increased online activity means greater risk of exposure to harmful content, which underscores why Ofcom was charged with its online safety remit in the first place. The regulator seems to be justifying its position – and its increased workforce – with reams of data, which it probably doesn't need to do: there is already broad agreement that this stuff is important. The discord comes when we start to look at ways of tackling it. And Ofcom isn't there yet.

About the Author(s)

Mary Lennighan

Mary has been following developments in the telecoms industry for more than 20 years. She is currently a freelance journalist, having stepped down as editor of Total Telecom in late 2017; her career history also includes three years at CIT Publications (now part of Telegeography) and a stint at Reuters. Mary's key area of focus is on the business of telecoms, looking at operator strategy and financial performance, as well as regulatory developments, spectrum allocation and the like. She holds a Bachelor's degree in modern languages and an MA in Italian language and literature.
