Cameron's filtering folly

UK government proposals would force UK ISPs to block access to adult material unless the bill payer opts in to receive it. The “think of the children” rhetoric is convincing and the sentiment is in the right place. But the solution – content filtering – is all wrong.

James Middleton

August 1, 2013


Over the past couple of weeks, the UK press has dedicated a fair few column inches to government proposals that would force UK ISPs to block access to adult material unless the bill payer opts in to receive it. With pictures of Prime Minister David Cameron standing in front of an NSPCC banner splashed over the front pages, the “think of the children” rhetoric is convincing. Yes, the sentiment is in the right place; who would argue against child protection? But the solution – content filtering – is not.

Under the announced proposals, UK internet users will have to actively opt in to be able to access pornography. Family-friendly content filters will automatically be put in place for new and existing internet users by year end if Cameron has his way, and search engines will be pressured to ‘blacklist’ illegal search terms by October.

But the measures go further than blocking porn. The Open Rights Group (ORG), an activist organisation created to defend internet freedom and privacy, which counts author Neil Gaiman among its patrons, has done some digging and claims to have come up with an inside view on what the filter will target, for now.

Apparently users will be greeted with a list and expected to make their choices:

Do you want to block

☑ pornography

☑ violent material

☑ extremist and terrorist related content

☑ anorexia and eating disorder websites

☑ suicide related websites

☑ alcohol

☑ smoking

☑ web forums

☑ esoteric material

☑ web blocking circumvention tools

That’s a nice little cross-section of internet content right there – including some startlingly vague and obscure categories: web forums and esoteric material? – and when you’re performing any kind of information filtering, having lots of data to throw at the system is useful. The more data a filter has, the more accurate it can be.
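To make the mechanics concrete, here is a minimal sketch of how a category-based URL filter of this sort might work. Everything in it – the category names, the domains and the helper function – is illustrative, not drawn from any real ISP deployment or blocklist.

```python
# Minimal sketch of a category-based URL filter, of the kind an ISP might
# sit in front of subscribers' traffic. All categories, domains and names
# here are hypothetical, for illustration only.
from urllib.parse import urlparse

# Hypothetical per-category domain lists; a real deployment would load
# millions of entries from vendor feeds and lists like the IWF's.
BLOCKLIST = {
    "pornography": {"adult.example"},
    "web_forums": {"forum.example"},
    "esoteric_material": {"occult.example"},
}

# Categories left ticked by the bill payer (the proposal's defaults).
ENABLED = {"pornography", "web_forums", "esoteric_material"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host falls in any enabled category."""
    host = urlparse(url).hostname or ""
    return any(host in BLOCKLIST[cat] for cat in ENABLED if cat in BLOCKLIST)

print(is_blocked("http://forum.example/thread/42"))  # True
print(is_blocked("http://news.example/politics"))    # False
```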

And this is where the media has been wrong-headed about this story. All the focus has been on the word “pornography” when it should have been on the word “filter”. The cynic will say this proposal has nothing whatsoever to do with child protection and everything to do with establishing a legal capability to filter all internet traffic – and in the wake of the PRISM revelations I’m inclined to hear the cynic out.

Once you have a system in place that is designed to filter one thing, it’s trivial to tweak it to start filtering something else. That’s the easy technical bit. Getting it past the voting citizens can be a bit trickier: maybe a small change to the existing legislation; maybe you just do it on the quiet and claim it’s in the interests of national security. Using child protection as the reason to implement such a system in the first place is a campaign everyone can get behind, but it doesn’t take much momentum to turn that campaign into the thin end of the wedge and, as the ORG puts it, a situation where the UK is “sleepwalking into censorship”.
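Continuing the hypothetical sketch above, here is all it would take to repurpose such a filter for a new target: a one-line data change, with the filtering code untouched.

```python
# Continuing the hypothetical sketch above: repurposing the filter is a
# data change, not an engineering project.
BLOCKLIST["political_criticism"] = {"dissent.example"}
ENABLED.add("political_criticism")
# Every request still flows through exactly the same is_blocked() check.
```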

Cameron said, quite rightly, that he expects a “row” with ISPs which are, in his view, “not doing enough to take responsibility”. But the row won’t be because they don’t want to protect children from adult content; it will be over the technical impossibility of delivering on this proposal.

What we have here is a classic case of a technical solution being forced on a problem by people without technical understanding.

Why is such a filter even necessary in the first place? To the best of my knowledge, one already exists. The Internet Watch Foundation (IWF) is a registered charity established in 1996, which states that its remit is “to minimise the availability of ‘potentially criminal’ internet content, specifically images of child sexual abuse hosted anywhere, and criminally obscene adult content in the UK”.

As part of its function, the IWF supplies an “accurate and current URL list to enable blocking of child sexual abuse content”. From 2010 the Office of Government Commerce (OGC) required all ISPs to use this list.

It’s effective in that it prevents accidental access to child pornography, and I’m not being facetious when I say that if such a thing were so rife that you could stumble upon it unintentionally, then we would have an entirely different problem to deal with. Indeed, the IWF long ago admitted that “images are sold or exchanged in other ways, for example via newsgroups or peer-to-peer” and said the blacklist was “never designed to try to stop offenders having access to this content”.

Web filtering software is almost a hygiene factor in any security suite, and family-friendly filters are not hard to find. The UK public is not desperate for someone to develop a filter, and search engines apply family-friendly filters by default. The choice already exists, and choice is important when it comes to censorship. Adults and parents should be responsible enough to decide whether they want to self-censor.

Moral issues aside, there are plenty of technical reasons why nationwide filtering is a bad idea. Filters are not infallible, for a start: one of the IWF’s most high-profile accidental blocks was of Wikipedia.

Australia rode a similar wave of controversy in 2007 and went ahead with a nationwide porn filter scheme that cost A$84m (around €58m in today’s money). After launch, it took one 16-year-old, who promptly made headlines, 30 minutes to break the system and bypass the filter.
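The details of that particular hack weren’t published, but the general point is easy to demonstrate. If a filter is enforced at the ISP’s DNS resolver, for example, bypassing it can be as simple as asking a different resolver. A sketch using the dnspython library, with example.com standing in for a blocked domain:

```python
# Sketch: side-stepping a DNS-level block by querying a public resolver
# directly rather than the ISP's filtered one. Requires dnspython
# (pip install dnspython); example.com stands in for a blocked domain.
import dns.resolver

resolver = dns.resolver.Resolver(configure=False)  # ignore the ISP-supplied resolver
resolver.nameservers = ["8.8.8.8"]                 # ask Google's public DNS instead

for record in resolver.resolve("example.com", "A"):
    print(record.address)  # the real address, unfiltered
```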

Why should the UK’s “porn filter” be any less futile and expensive a project, one that might serve only to set the government up for embarrassment? How the plan will be implemented at household level hasn’t been addressed, but if it involves us all one day being greeted by a series of checkboxes, surely the first person to the computer holds the power? Families don’t have a ‘computer room’ like they did 15 years ago; any one of a number of devices is an internet terminal, and kids have plenty of their own that their parents perhaps aren’t even aware of. It’s a sweeping statement, but kids of a certain age, as the Australian teenager highlighted, also tend to be more technically capable than their parents.

As a side note on the lack of technical awareness, the UK proposals also include the creation of a secure database of banned child pornography images gathered by police across the country, which will be used to trace illegal content and the paedophiles viewing it. Again, the sentiment sounds laudable, but whenever you have a large, centralised database accessed by lots of people, you will always get leaks – where does Wikileaks get all its information from? The last thing the government wants is to become the actual source of banned images leaking out, but history suggests that is exactly what will happen. The UK’s track record on building large-scale government databases is not enviable.

But plenty of countries fare little better. In Denmark, where censorship has been prohibited since 1849 by the country’s constitution, internet filtering crept in under the radar and was only brought to the public’s attention in 2008 when Wikileaks released a list of 3,863 sites being filtered. Some of these sites, while indecent, were perfectly legal. As a result, Denmark was forced to bring a formal framework into play in 2012 and has since been criticised for breaching its own constitution.

In Germany, the government gave up on plans for nationwide internet filtering in the name of clamping down on child pornography in 2011, in the face of public outcry. Opponents underlined the proposal’s inefficiency and demanded more be done to tackle the real problem – deleting child porn websites rather than merely making them inaccessible to a particular nation.

And here’s the crux of it. Indeed, there are dark corners of the internet where all sorts of horrible and illegal content is available, but UK ISPs don’t have any control over that. They may be able to prevent casual access to illegal material, but casual access isn’t the real problem. The people who fuel these dark economies in the darkest parts of the internet are technically astute enough that any filtering methods put in place will have zero effect on the actual problem.

When European authorities began campaigning against the Pirate Bay, for example, the site came up with workarounds so the hard core of its users could continue to access it – and they still can today. It was really only the casual users who were put off, and the site persists, catering to its dedicated supporters.

It’s at this point that an article on censorship crosses the tricky line into privacy, because the two are intertwined in a grey quagmire of morality. Privacy is important. Whether or not it actually exists in a post-PRISM world is another article, but the tools to ensure it certainly do. Encrypted communications are necessary for financial transactions to take place, and for those responsible for protecting our freedom to talk to each other without fear of the bad guys getting wind. Encrypted communications are applauded by the ‘good states’ when used to aid noble democratic causes like the Arab Spring, or to undermine oppressive regimes in Turkey or Russia. But they become problematic when used by the bad guys on the dark web.

You have to be careful with definitions too, of course; just because a site exists on the dark web doesn’t necessarily make it evil. Some of these dark internet communities are relatively conscientious. Take the Silk Road, for example. The site can’t be located, nor can it be shut down, because it’s only accessible through the encrypted peer-to-peer maze of The Onion Router (Tor). Amongst an amusingly benign collection of shopping categories, including books and tickets for things like Alton Towers, you will find a (literally) mind-bending array of drugs for sale. The site is run and patronised by people who think they know better than their respective governments what’s good for them, but in order to keep the site alive, users must abide by a strict code of conduct.

To use the site you must have an account, and in the dark web, just as on eBay, whether you’re selling deckchairs or LSD, you live and die by your reputation. Child porn is outlawed, as are assassinations, and as of last year guns are off the menu too. Yet within the onion-like layers of the dark web, all these things and more are readily available, and there are plenty of sites out there with fewer scruples than the Silk Road.
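None of this requires exotic tooling, which is rather the point: an ISP-level filter sees only encrypted traffic to Tor relays, not what travels inside it. With a local Tor client running, any SOCKS-aware HTTP library can route through it; a sketch with Python’s requests, using a made-up placeholder for the onion address:

```python
# Sketch: routing an HTTP request through a locally running Tor client.
# Requires the Tor daemon on its default SOCKS port (9050) and requests
# with SOCKS support (pip install requests[socks]). The .onion address
# below is a made-up placeholder, not a real hidden service.
import requests

TOR_PROXY = {
    "http": "socks5h://127.0.0.1:9050",   # socks5h resolves hostnames inside Tor
    "https": "socks5h://127.0.0.1:9050",
}

response = requests.get("http://exampleplaceholder.onion/", proxies=TOR_PROXY, timeout=60)
print(response.status_code)
```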

Cameron’s filter does nothing to target these sites and this, arguably, is where the real horror exists. As a result, you can see why there are those who are sceptical of the filter’s true purpose – that it’s just another key on the chain of the warden in the nanny state, one that jingles in time to the age-old mantra: “If you have nothing to hide you have nothing to fear”.

So if Cameron and the UK government are serious about stamping out “things going on that are a direct danger to our children, in the darkest corners of the internet”, we should be looking to a different approach, not one that just hides the problem, which is effectively what a filter does.


About the Author

James Middleton

James Middleton is managing editor of telecoms.com | Follow him @telecomsjames
