Ofcom fires warning shot to search engines on self-harm content
Ofcom has published the results of a study that demonstrates the ease with which users can access content promoting self-harm and suicide, coupled with a hint that a crackdown may be coming.
January 31, 2024
The UK telecoms regulator chose its words very carefully in the statement that accompanies the study, but it's pretty clear that it is designed to be a warning to the major search engines that closer scrutiny of their role is in the offing.
As part of its newly acquired role as the UK's online safety regulator, Ofcom plans to launch a consultation in the spring on its Protection of Children Codes of Practice, part of which covers the way minors are able to access harmful content online. The findings of the new report, 'One Click Away: A Study on the Prevalence of Non-Suicidal Self Injury, Suicide, and Eating Disorder Content Accessible by Search Engines,' which was commissioned by Ofcom and carried out by the Network Contagion Research Institute (NCRI), will feed into that consultation.
The codes will set out the practical steps search services can take to protect children, Ofcom said.
"Search engines are often the starting point for people's online experience, and we're concerned they can act as one-click gateways to seriously harmful self-injury content," said Almudena Lara, Online Safety Policy Development Director at Ofcom.
Ofcom said 22% of the search results analysed linked, in a single click, to content that glorifies or offers instruction about non-suicidal self-injury, suicide or eating disorders; the study saw researchers analyse more than 37,000 result links returned by the big five search engines. Other points of note were that image searches were identified as a particular risk, delivering the highest proportion of harmful or extreme results at 50%, while the use of deliberately obscured search terms made people six times more likely to find harmful content about self-injury.
"Search services need to understand their potential risks and the effectiveness of their protection measures – particularly for keeping children safe online – ahead of our wide-ranging consultation due in Spring," she added.
It would be premature to suggest that Lara's comment indicates that Ofcom is about to take on the search engines. There has been endless discussion across the industry over the years about the role the Internet companies play – or should play – in policing the content they host or to which they facilitate access. Any attempt from a regulator like Ofcom to seriously increase the level of responsibility on them would be met with resistance.
But Ofcom is flexing its muscles a little bit. It namechecks the big five search services – Google, Microsoft Bing, DuckDuckGo, Yahoo! and AOL – in its announcement, and there is an indication it may ask for certain changes.
"Search services must act to ensure they are ready to fulfil their requirements under the Online Safety Act. Specifically, they will have to take steps to minimise the chances of children encountering harmful content on their service – including content that promotes self-harm, suicide and eating disorders," the regulator said.
While the Online Safety Act technically gives Ofcom significant powers to impose sanctions on even the largest companies, it is unlikely to start throwing its weight about. But it is taking its new role very seriously.
Indeed, the Financial Times recently reported that Ofcom has been on a hiring spree, picking up staff from big technology companies in particular. The regulator has apparently signed up 350 new employees with the remit of tackling online safety and has plans to hire at least 100 more.
This influx of staff probably accounts for the amount of work – and self-publicity – we are seeing from Ofcom in the online safety sphere. And there are going to be plenty more studies, reports, consultations and so forth as the regulator's new role develops.