Security

Last year there was a widely reported public dispute between the FBI and Apple over the encryption protecting an iPhone that belonged to the San Bernardino shooter. The FBI wanted Apple to unlock the phone, but Apple argued that doing so was not only difficult, it would also endanger every other similar device. The dispute made its way to the courts, where the FBI eventually withdrew after facing fierce opposition from the tech industry.

This is a debate with two very clear sides. On one side, “law and order” types argue that security agencies should get unlimited access to technological communications, even when these are protected by strong encryption, on the grounds that stopping terrorists is in the public interest. On the other side stands practically the entire technical industry and community, which argues that any such access to a device would endanger all other devices. The two positions are irreconcilable. Commenting on the case at the time, the EFF wrote:

“The power to force a company to undermine security protections for its customers may seem compelling in a particular case, but this week’s order has very significant implications both for technology and the law. Not only would it require a company to create a new vulnerability potentially affecting millions of device users, the order would also create a dangerous legal precedent. The next time an intelligence agency tries to undermine consumer device security by forcing a company to develop new flaws in its own security protocols, the government will find a supportive case to cite where before there were none.”

The debate is now underway once more in the UK following the heinous terrorist attack on Westminster Bridge last week, which resulted in the deaths of five people. It was reported that the attacker had sent a WhatsApp message just before committing the act, and the UK’s Home Secretary Amber Rudd has been criticising tech companies for not allowing police and security services access to encrypted communications. The right-wing press, notably The Sun and the Daily Mail, has joined in, directing angry calls at tech giants such as Google and WhatsApp to help in the efforts against terrorism.

The problem with many of the articles calling for encryption to be opened up is that they are often written by people who are clueless about the subject matter, and who do not realise just how misguided the calls for an end to encrypted tools are. Take the pictured clip from the Daily Mail, which lists WordPress among the offending apps that let “extremists plot in secret”. WordPress is nothing more than a blogging and publishing platform (this blog runs on WordPress), and it makes no sense whatsoever to class it as a tool that allows extremists to communicate.

We are often sold the need to end encryption, or even to permit torture, on the premise that at some point we will need access to a terrorist’s personal communications in a pressing “ticking bomb” scenario. It is worth remembering that this scenario has never materialised outside of a Hollywood movie.

Furthermore, the number of terrorists plotting on these communication tools is dwarfed by the number of everyday users, so when newspapers and politicians call for an end to encryption in those tools, what they are really proposing is a change to how ordinary people communicate, one that would leave millions vulnerable to hacking. Thankfully, parts of the media have responded to the misguided calls in a rational manner. Jonathan Haynes wrote in The Guardian:

“While you can legislate to only give state agencies access to terrorists’ communications, and with proper oversight and authorisation, you cannot actually build encryption that works like that. If you put a backdoor in, it’s there not just for security services to exploit, but for cyber-criminals, oppressive regimes and anyone else. So how do you allow security services access to terrorist communications? The UK government could conceivably ban messaging companies that offer end-to-end encryption from operating in the UK. However, it is not clear how you would enforce that – and indeed it would be the people who do not want to be monitored who would find ways to avoid it.”

And Will Gore wrote in The Independent:

“Equally though, the Government must recognise that this is 2017. Technological advances may pose new challenges to law enforcement, but they have also – in a great many ways – made the world a better, and in some ways safer, place. To imply that we can markedly reduce the threat posed by determined terrorists by making encrypted online messages accessible to the police may be neat. But it’s stunningly unsophisticated, technologically lazy and almost certainly wrong.”

This is the issue we are presented with. Calls for technology firms to offer backdoor access to private, encrypted communications must be read as calls to endanger everyone’s communications by making them easier for hackers to read. Moreover, encryption is not proprietary; it is just a clever use of maths, and there is no way governments will ever be able to ban that. If one app is somehow made vulnerable, terrorists will simply move to another method, and we the public will still be left vulnerable.
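To make the point concrete, here is a minimal sketch in Python, assuming the freely available open-source cryptography package; the escrow key and message contents are purely hypothetical illustrations, not anyone’s actual scheme. The first few lines show that strong encryption is nothing more than publicly documented maths anyone can run in a handful of lines; the rest shows that a backdoored, key-escrow variant is readable by whoever happens to hold the extra key, lawful agency and criminal alike.

```python
# A minimal sketch, not a production protocol. It assumes the open-source
# `cryptography` package (pip install cryptography); the escrow key and the
# message contents are hypothetical illustrations.
from cryptography.fernet import Fernet

# 1. Strong encryption is a few lines of freely available, well-documented maths.
chat_key = Fernet.generate_key()            # shared only by the two parties
channel = Fernet(chat_key)
ciphertext = channel.encrypt(b"hello")
assert channel.decrypt(ciphertext) == b"hello"

# 2. A backdoored ("key escrow") variant: every message is also encrypted
#    under a master key so that the authorities can read it on demand.
ESCROW_KEY = Fernet.generate_key()          # held by a government agency, until it leaks
escrow = Fernet(ESCROW_KEY)

def send_with_backdoor(plaintext):
    # Return the copy for the recipient and the copy readable via the escrow key.
    return channel.encrypt(plaintext), escrow.encrypt(plaintext)

to_recipient, to_escrow = send_with_backdoor(b"meet at noon")

# 3. Decryption checks only possession of the key, never the intent of the
#    holder: anyone who obtains ESCROW_KEY can read every user's messages.
stolen_key = ESCROW_KEY                     # e.g. exfiltrated by an attacker
print(Fernet(stolen_key).decrypt(to_escrow))    # b'meet at noon'
```

Nothing in that last step checks who holds the key or why; possession is all that matters, which is precisely the point Haynes makes above.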

But you may argue that we should never give up, and that the fight against terrorism is a worthy cause. It certainly is, but we cannot surrender our expectations of security on the assumption that somewhere terrorists are using an encrypted tool to communicate with one another. There is little evidence that this is the case, and indeed strong evidence to the contrary: the Paris attackers used unencrypted burner mobile phones to communicate, and favoured face-to-face contact.

We cannot give away our rights based on fables and ignorance.


1 Comment


Burner Combustion Systems LLC · August 13, 2017 at 9:05 am

I understand that people want to reduce terror attacks, and anything that helps with that should, for the most part, be supported. However, when you talk about creating a back door on phones, which weakens security for everyone and violates users’ privacy rights, you must weigh the gain in security from preventing terror attacks against the weakened privacy of citizens and the reduced security of phones going forward.

The Burr-Feinstein bill (Compliance with Court Orders Act of 2016) was introduced last year in an attempt to require companies to create a ‘back door’ for government officials so they could access user data without needing the user’s password. The bill had no support in the House or the Senate, so Senators Burr and Feinstein abandoned their efforts to pass it. This year, however, they tried to revive it with the support of James Comey, who said that half of FBI cases involve phones containing information the Bureau cannot access. The bill went nowhere again.
I don’t believe we are at risk of it becoming law now or anytime in the near future, but even though the bill is highly unpopular, I wouldn’t get too comfortable. It’s just a matter of time (probably the next terror attack) before they attempt to revive it.

