Online censorship pressure mounts ahead of 2024 election season

Recent developments in the EU, US and UK all contribute to a sense of increasing pressure to shape and constrain the digital public square.

The US-based Computer & Communications Industry Association (CCIA) today issued a press release with the following headline: “Media Freedom Act: EU Parliament Risks Enabling Spread of Harmful Content With Media Exemption”. The European Media Freedom Act (EMFA) aims to improve the functioning of the internal (EU) media market. Part of this involves trying to reconcile media freedom and plurality with the demands of the sweeping digital markets regulations being implemented by the bloc.

Among the extensive trade-offs and compromises being proposed to the EMFA is a media exemption that would impose a 24-hour delay to the censorship of whatever is arbitrarily deemed to be proper journalism. The CCIA thinks the resulting risk of wrongspeak being allowed to remain at large for a bit longer is unacceptable.

“The proposed media exemption is a dangerous loophole that could be used to spread harmful content and undermine our democracy,” said Mathilde Adjutor, CCIA Europe’s Senior Policy Manager. “CCIA Europe calls on EU policymakers to close this loophole and ensure that the EMFA complements, rather than undermines, the Digital Services Act’s ongoing implementation.

“If the European Union fails to act, rogue actors could use the full 24-hour window introduced by MEPs to spread misleading information – about elections for example – before online platforms are allowed to take it down.”

The EU recently released a study named “Digital Services Act: Application of the Risk Management Framework to Russian disinformation campaigns”. Its executive summary reads as follows: “During the first year of Russia’s illegal war in Ukraine, social media companies enabled the Kremlin to run a large-scale disinformation campaign targeting the European Union and its allies, reaching an aggregate audience of at least 165 million and generating at least 16 billion views.

“Preliminary analysis suggests that the reach and influence of Kremlin-backed accounts has grown further in the first half of 2023, driven in particular by the dismantling of Twitter’s safety standards.”

Since its acquisition by Elon Musk, Twitter (now X) has come under consistent censorship pressure and the Russia angle is nothing new. Twitter put itself in the EU’s crosshairs when it declined to sign up to a ‘voluntary’ disinformation code and this report seems designed, at least in part, to increase the pressure on Twitter to do what it’s told.

Another source of pressure on Twitter to censor more is the US-based Anti-Defamation League (ADL), which has once more been urging social media platforms and search engines in general to censor more aggressively. Musk seems to have taken this personally and is now threatening to sue the ADL for, you guessed it, defamation. You can see for yourself how the ADL frames its position via a recent TV appearance by its CEO.

Lastly, the contentious UK Online Safety Bill, which will give Ofcom sweeping censorship powers, looks set for yet another amendment. This time politicians are attempting to resolve the profound issues created by any attempt to interfere with the end-to-end encryption of instant messages.

DCMS junior minister Stephen Parkinson said the following to the House of Lords as part of a broader debate on the Online Safety Bill yesterday: “Let me be clear: there is no intention by the Government to weaken the encryption technology used by platforms, and we have built strong safeguards into the Bill to ensure that users’ privacy is protected.

“While the safety duties apply regardless of design, the Bill is clear that Ofcom cannot require companies to use proactive technology on private communications in order to comply with these duties. Ofcom can require the use of a technology by a private communication service only by issuing a notice to tackle child sexual exploitation and abuse content under Clause 122.

“A notice can be issued only where technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content. Ofcom is also required to comply with existing data protection legislation when issuing a notice under Clause 122 and, as a public body, is bound by the Human Rights Act 1998 and the European Convention on Human Rights.

“When deciding whether to issue a notice, Ofcom will work closely with the service to help identify reasonable, technically feasible solutions to address child sexual exploitation and abuse risk, including drawing on evidence from a skilled persons report. If appropriate technology which meets these requirements does not exist, Ofcom cannot require its use.”

That this statement is viewed as a partial win by privacy-focused messaging platform Signal is a very positive sign. Having said that, there are still plenty of significant issues remaining with the Online Safety Bill, not least its desire to censor ‘legal but harmful’ content.

The electoral shocks of 2016 – Brexit and Trump – rattled the political establishment, which largely blamed social media for allowing the electorate to be led astray by baddies. The Cambridge Analytica storm in a teacup was an early expression of that and the Twitter Files revealed the subsequent escalation in online political censorship activity.

There is a saying that history may not repeat itself but it often rhymes. The Cambridge Analytica saga focused on alleged Russian meddling in the 2016 elections, and the Ukraine conflict appears to have ensured Russia will be offered as the main justification for political censorship in the current electoral cycle, which features major votes in the US and EU. That will be augmented by the usual claims of mis/dis/malinformation and other subjective, ill-defined pejoratives such as ‘hate’.

While online platforms do have to exercise some control over the speech they host, we have always argued that they should censor only in accordance with national laws. When alerted to censorship taking place at the tier 1 ISP level, the Electronic Frontier Foundation made a similar distinction. Would-be political censors often argue that they’re trying to protect democracy, but surely electoral outcomes occurring in an environment of ad hoc censorship are more likely to achieve the opposite.
