Europe told it can force Facebook to do more on illegal content
June 4, 2019
An opinion issued by Advocate General Maciej Szpunar, an influential advisor to the Court of Justice of the European Union, has suggested Facebook can be forced to do more to crack down on illegal and offensive content.
Limiting and blocking content on social media sites is an incredibly difficult topic to address. Not only is there the risk of alienating individuals by drawing a strict line on what is deemed offensive, there is also the danger of infringing on freedom of speech rights. Facebook has always tried to stay at arm’s length from this tricky conundrum, but the opinion from Szpunar might offer ammunition for red-tapers throughout Europe to hold the social media giant more accountable.
The document states:
“In today’s opinion, Advocate General Maciej Szpunar considers that the Directive on electronic commerce does not preclude a host provider which operates a social network platform, such as Facebook, from being ordered, in the context of an injunction, to seek and identify, among all the information disseminated by users of that platform, the information identical to the information that has been characterised as illegal by a court that issued that injunction.”
As with most legal matters, this is a highly complicated and nuanced case, not only because it deals with the intersection of speech moderation and individual freedoms, but also because social media platforms are incredibly complex machines. Applying new rules is not straightforward, especially when you consider the platforms span multiple geographies, languages and legal jurisdictions.
In short, Szpunar suggests social media platforms can be forced to remove all content identical to material which has already been ruled illegal, and that these obligations can be applied to Facebook globally.
This case dates back to 2017 and Austrian Green Party politician Eva Glawischnig. Glawischnig claimed that comments made about her on Facebook were defamatory, an Austrian court agreed, and Facebook was ordered to take down the posts. Facebook complied, but only in Austria, prompting Glawischnig to argue the takedown should be extended across the entire social media platform and should also cover any verbatim re-postings.
Further filings were made to the Court of Justice of the European Union, leading to Szpunar’s opinion today. It is worth noting that the court does not have to follow the opinion of Szpunar, but in most cases it does follow the opinions of its eleven appointed Advocates General.
In the opinion of Szpunar, Facebook can be told to spread the net further; there are no limitations preventing social media platforms from being forced to comply globally, while Facebook will also have to do more to remove identical content which has already been deemed illegal by the courts.
What some might find more objectionable is the slight mission creep from European institutions. The opinion suggests there is no reason Facebook cannot be forced to apply these rules globally, though sovereign nations might object to being told how to govern their own states. This is another very sensitive area, especially at a time when international relations are fragile.
Europe is taking a much more stringent stance against the internet giants than many other nations around the world, though we suspect there will be critics suggesting it is overstepping the mark here. And to be fair, they would have a point. What right does Europe have to impose its own interpretation of free speech principles on other territories? Why should its approach be considered more appropriate than anyone else’s?
This is not the first time it has been suggested Europe is overstepping the mark. The same Advocate General came to the conclusion Europe had the right to force Google to remove archive listings of some news stories on a global level last year.
In this case, two businessmen who had served criminal convictions argued the stories should be deindexed as the punishment had been served, and there was a risk their future employment prospects would be impacted should the stories be discovered.
This is a slightly different case, as the ‘right to be forgotten’ saga with Google leans on the principles of privacy and the fact the two individuals in question had been ‘rehabilitated’. The complication with Glawischnig’s case is that it rests on the legal definition of defamation, which might vary from jurisdiction to jurisdiction.
Alongside the implications for freedom of speech, this would also force Facebook to become a more active moderator of the content which is published on its platform.
This is where Facebook has been able to dodge many bullets over the past decade; it is not a publisher, and therefore should not be held accountable for the opinions, or the management of those opinions, published on its platform. It has cast itself in the role of curator and platform provider, avoiding the term ‘publisher’, as this would imply it has more influence and control than it wants to have.
The selling point of many social media platforms is ‘unmoderated’ content; users can put anything they want online. In the early days this freedom democratized opinion, though there are now elements of society who use the platforms in ways deemed nefarious or contrary to societal benefit.
Not only does Facebook want to avoid the difficulty and legal complexity of becoming a more active moderator of content, it wants to remain true to the initial function of the platform: the freedom to do and say whatever the user wishes. This was attractive to users in the early years, and should Facebook want to re-engage the masses, it will have to offer an experience which remains appealing.
And while the technology giants might not like the direction this case is heading, some governments certainly will.
Governments around the world are increasingly looking for ways to strengthen their grip on the internet industry. Part of this is down to the abuse of tax loopholes, some is about protecting the innocence of those reading material online, and a slice is about increasing the capabilities of intelligence agencies and police forces.
There are of course numerous different approaches from governments when addressing illegal content and hate speech. Australia, for example, has passed the Sharing of Abhorrent Violent Material bill, imposing harsh penalties on those who do not take down what the law deems ‘abhorrent violent material’, though it still has its critics. In this case, there is some sympathy for the internet giants, as everything they currently do is reactive, reliant on complaints from those who have already been exposed to the material.
Germany was one of the first, and strictest, countries to tackle the conundrum, while the UK has developed its own AI to identify content it has deemed terrorist in nature. The US is taking a much more hands-off approach, though there are clearly PR points for politicians to win here; Senators and Representatives might find themselves drawn into the debate before too long.
This is an immensely complicated area of social media. The courts will only want to make firm judgments when absolutely necessary, as many are cautious when it comes to precedent. Anything written in stone can be cleverly massaged by well-paid lawyers, expertly practiced in the field of nuance, to mean a variety of things.
This is also not the final judgment. The Court of Justice of the European Union usually delivers its ruling three to six months after the Advocate General’s opinion, leaving plenty of opportunity for different arguments to be heard.
We’ve been keen to avoid the word censorship in this article, as it can be a very inflammatory way to describe an incredibly complicated and sensitive issue, but this is of course one of the risks the world runs when restrictions are imposed on freedom of speech.