Ofcom promises quick action against tech firms on online safety
Ofcom has pledged to move quickly to censure technology companies that fail to adequately police their online platforms as new safety regulations come into force.
December 16, 2024
The UK telecoms regulator, which added the title of online safety regulator to its remit just over a year ago when the Online Safety Act became law, on Monday announced that it has published its first-edition codes of practice and guidance on tackling illegal harms online.
Essentially, that means social media firms, search engines, providers of messaging, gaming and dating apps, and owners of pornography and file-sharing sites are now required to take action to prevent criminal activity on their platforms. Specifically, they have three months to complete illegal harms risk assessments.
"We have already been speaking to many tech firms – including some of the largest platforms as well as smaller ones – about what they do now and what they will need to do next year," Ofcom said.
"While we will offer support to providers to help them to comply with these new duties, we are gearing up to take early enforcement action against any platforms that ultimately fall short," the regulator warned.
Such action could see companies that fail to meet their requirements fined up to £18 million or 10% of revenues, whichever is greater. In extreme cases the regulator can apply for a court order to have the offending site or app blocked in the UK.
Ofcom has talked tough on applying its new powers from the outset, but we are now approaching the crunch point at which we will be able to judge for ourselves whether it has overpromised.
As outlined above, every site and app that falls within the scope of the new laws has until 16 March to complete an assessment of the risks illegal content poses to children and adults on its platform, Ofcom explained. The following day – assuming Ofcom's various codes of practice have passed through the parliamentary process by then – those sites and apps will have to start implementing safety measures to mitigate those risks.
And Ofcom has laid out some expectations of what these should entail, including the appointment of a senior person accountable for online safety, appropriately resourced moderation teams, and robust reporting and testing of algorithms.
It is focusing hard on protecting children from sexual abuse and exploitation online, and has outlined a number of safety measures around the way children's profiles and locations are displayed, and how they can be contacted.
The regulator is also looking at the protection of women and girls, fraud identification, and the removal of terrorist accounts, amongst other things.
"For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today," declared Ofcom chief executive Dame Melanie Dawes.
"The safety spotlight is now firmly on tech firms and it’s time for them to act. We'll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year," she said. "Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them."
And after all its strong words, Ofcom can expect to come under close scrutiny as its time to act approaches.