Tech giants hit back against GCHQ’s ‘Ghost Protocol’
GCHQ’s new proposal, supposedly intended to boost the security services’ and police’s ability to keep us safe, has been slammed by the technology industry, which suggests the argument contradicts itself.
May 30, 2019
In an article for Lawfare, GCHQ’s Technical Director Ian Levy and Head of Cryptanalysis Crispin Robinson presented six principles to guide ethical and transparent eavesdropping, while also suggesting intelligence officers can be ‘cc’d’ into group chats without compromising security or violating the privacy rights of the individuals involved.
The ‘Exceptional Access Debate’ is one way in which GCHQ is attempting to undermine the security and privacy rights offered to consumers by some of the world’s most popular messaging services.
Responding in an open letter, the likes of the Electronic Frontier Foundation, the Center for Democracy & Technology, the Government Accountability Project, Privacy International, Apple, Google, Microsoft and WhatsApp have condemned the proposal.
“We welcome Levy and Robinson’s invitation for an open discussion, and we support the six principles outlined in the piece,” the letter states. “However, we write to express our shared concerns that this particular proposal poses serious threats to cybersecurity and fundamental human rights including privacy and free expression.”
Levy and Robinson suggest that instead of breaking the encryption built into these messaging platforms, the likes of Signal and WhatsApp should attach virtual “crocodile clips” to the conversation, effectively adding a ‘ghost’ spook into the loop. The encryption protections would remain intact and the users would not be made aware of the slippery eavesdropper.
In justifying the proposal, Levy and Robinson claim this is effectively the same practice the telco industry has undertaken for years. In the early days, physical crocodile clips were placed on telephone wires to intercept conversations, which later evolved into simply copying call data. As this is an accepted practice, Levy and Robinson see no issue with the encrypted messaging platforms offering a similar service to the spooks.
However, the coalition of signatories argues the proposal is flawed on two fronts: firstly technical, and secondly ethical.
On the technical side, the way keys are exchanged to authenticate a conversation would have to be altered. As it stands, public and private key pairs are issued to the initiator and recipients of the conversation. These keys correspond to one another, are tied to specific individuals and only change when new participants are added to the conversation. To add a government snooper covertly, the keys would have to be changed without notifying the participants.
Not only would this require changes to the way encryption technologies are designed and implemented, it would also undermine the trust users place in the messaging platform. Levy and Robinson are asking the messaging platforms to suppress any notifications to the participants of the conversation, effectively breaking the trust between the user and the brand.
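To picture what that suppression involves, consider the deliberately simplified, hypothetical sketch below. It is not the Signal or WhatsApp protocol, and every name in it (GroupChat, add_participant, the “ghost” listener) is invented for illustration; it only models a group chat that rotates its shared key and notifies members whenever the membership changes.

```python
# A deliberately simplified, hypothetical model of group membership and key
# rotation. It is not the Signal or WhatsApp protocol; it only illustrates why
# a silent 'ghost' participant requires suppressing the notifications users
# normally rely on.
import os
from dataclasses import dataclass, field

@dataclass
class GroupChat:
    participants: set = field(default_factory=set)
    group_key: bytes = field(default_factory=lambda: os.urandom(32))
    notifications: list = field(default_factory=list)

    def add_participant(self, name: str, notify: bool = True) -> None:
        """Add a member and rotate the shared key so they can read new messages."""
        self.participants.add(name)
        self.group_key = os.urandom(32)  # every member must be sent the new key
        if notify:
            # In a trustworthy client this notice cannot be switched off.
            self.notifications.append(f"{name} joined - security code changed")

chat = GroupChat()
chat.add_participant("alice")
chat.add_participant("bob")

# The 'ghost' proposal amounts to the step below: a third key holder is added,
# but the notification is suppressed, so Alice and Bob never learn that the
# conversation's membership (and key) has changed.
chat.add_participant("ghost_listener", notify=False)

print(chat.participants)   # {'alice', 'bob', 'ghost_listener'}
print(chat.notifications)  # only the legitimate joins are visible
```

Real end-to-end encrypted services handle keys in far more involved ways, but the essential point stands: the only thing keeping the extra recipient secret is the client’s decision not to tell its user.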
While GCHQ may think it is presenting a logical and transparent case, prioritising responsible and ethical use of technology, the coalition also argues it is contradicting the principles laid out in its own article. Those principles are as follows:
1. Privacy and security protections are critical to public confidence, therefore authorities would only request access to data in exceptional cases
2. Law enforcement and intelligence agencies should evolve with technology, and the technology industry should offer these agencies greater insight into product development to aid this evolution
3. Law enforcement and intelligence agencies should not expect to be able to gain access to sensitive data every time a request is made
4. Targeted exceptional access capabilities should not give governments unfettered access to user data
5. Any exceptional access solution should not fundamentally change the trust relationship between a service provider and its users
6. Transparency is essential
Although the coalition of signatories takes issue with all six points, for us it is the last two which are the hardest to square with the proposal.
Firstly, if ‘Ghost Protocol’ is accepted by the industry and implemented, there is no way to avoid undermining or fundamentally changing the trust relationship between the platform and the user. The platform promises a private conversation, without exception, while the GCHQ proposal requires data interception without the knowledge of the participants. The two ideas are contradictory.
“…if users were to learn that their encrypted messaging service intentionally built a functionality to allow for third-party surveillance of their communications, that loss of trust would understandably be widespread and permanent,” the letter states.
The sixth principle is another which is difficult to stomach, as there is absolutely nothing transparent about this proposal. In fact, the open letter points out that under the Investigatory Powers Act, passed in 2016, the UK Government can force technology service providers to hold their tongue through non-disclosure agreements (NDAs). These NDAs could bury any intrusion or interception for decades.
It’s all very cloak and dagger.
Another big issue for the coalition is that of creating intentional vulnerabilities in the encryption software. To meet these demands, providers would have to rewrite software to create the opportunity for snooping. This creates two problems.
Firstly, there are nefarious individuals everywhere, not only in the deep, dark corners of the internet but also working within law enforcement and intelligence agencies. Introducing such a vulnerability into the software opens the door to abuse. Secondly, there are individuals capable of hacking into the platforms that built said vulnerability and exploiting it.
At the moment, encryption techniques are incredibly secure because not even those who designed the encryption software can crack it. Create a vulnerability, however, and the platforms themselves become a hacker target because of it. Finding the backdoor would be the biggest prize in the criminal community, the Holy Grail of the dark web, and considerable rewards would be offered to whoever found it. The encrypted messaging platforms could potentially become the biggest hacking target on the planet. No individual or organization is 100% secure, therefore this is a very real risk.
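As a rough way to picture that risk, the hypothetical sketch below (all names, such as MessagingServer and attach_ghost, are invented for illustration and do not describe any real system) shows why a built-in exceptional-access hook is indifferent to who invokes it: a warranted intercept and a compromised insider exercise exactly the same code path.

```python
# Hypothetical, simplified server-side sketch. It shows why building a 'ghost'
# hook creates a reusable attack surface: the capability itself cannot tell a
# lawful request from a corrupt insider or a stolen credential.
from dataclasses import dataclass, field

@dataclass
class MessagingServer:
    # chat_id -> list of hidden listeners silently copied on future messages
    ghosts: dict = field(default_factory=dict)

    def attach_ghost(self, chat_id: str, listener: str, authorised: bool) -> None:
        """The mandated exceptional-access path."""
        if not authorised:
            raise PermissionError("warrant required")
        self.ghosts.setdefault(chat_id, []).append(listener)

server = MessagingServer()

# Intended use: a warranted intercept approved through the proper channels.
server.attach_ghost("chat_42", "lawful_intercept_unit", authorised=True)

# The problem: 'authorised' is just a flag inside the provider's own systems.
# Anyone who compromises those systems, or works within them dishonestly,
# gains exactly the same capability, with the same silence towards the user.
server.attach_ghost("chat_42", "attacker", authorised=True)

print(server.ghosts)  # the user sees neither listener
```

The prize, in other words, is not any single conversation but the hook itself.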
Beyond these concerns over security vulnerabilities and the breach of user trust, another massive consideration which cannot be ignored is the human right to privacy and freedom of expression.
Will these rights be infringed if users are worried there might be someone snooping on their conversation? The idea stokes fears of a surveillance state, though we will leave it up to readers to decide whether GCHQ has satisfied the requirements to protect user security, freedom of expression and privacy.
For us, if any communications provider is to add law enforcement and intelligence agencies to conversations in such an intrusive manner, there need to be deep and comprehensive guarantees that these principles will be upheld. Here, we do not believe there are.