So, David Cameron has stated that the UK will look to switch off some forms of encryption, and that this will help counter the threats to our society [Link]. Such a statement may be grand in scope, and makes a great sound bite, but it is impossible to implement and extremely naïve from a technical point of view. It is also negligent in failing to protect users from a range of threats.
There are so many holes in the argument to restrict encrypted content, including:
The legal system often takes a while to catch up with technology, and generally new laws are created from a foundation built on the past. On the Internet and on computer systems, law enforcement has led a privileged life, where investigators could easily examine the disks of suspects, as there was little in the way of security. The increasing focus on privacy, especially on encrypting by default, poses a major challenge to investigators, as does the increasing use of multi-factor authentication. There is thus a major ongoing battle between the right to privacy and the right to investigate.
This article outlines some of the tensions that are being felt on both sides of the argument. Many of the rights we have built up in traditional investigations, such as the right to remain silent, are now being challenged in this Cyber Age.
The ability for defence agencies to read secret communications and messages gives them a massive advantage over their adversaries, and is at the core of many defence strategies. Most of the protocols used on the Internet are clear-text ones, such as HTTP, Telnet, FTP, and so on, but increasingly we are encrypting our communications (such as with HTTPS, SSH and FTPS), where an extra layer of security (SSL) is added to make it difficult for intruders to read and change our communications. While not perfect, and open to a man-in-the-middle attack, it is a vast improvement over communication where anyone who can sniff the network packets can read (and change) them. The natural step forward, though, is to encrypt the actual data before it is transmitted, and when it is stored. In this way not even a man-in-the-middle can read the communications, and the encryption key only resides with those who have rights to access it.
While many defence mechanisms in security have been fairly easy to overcome, cryptography – the process of encrypting and decrypting using electronic keys – has been seen as one of the most difficult to defeat. It has thus been a key target for many defence organisations, with a whole range of conspiracy theories around the presence of backdoors in cryptography software, through which defence agencies could spy on their adversaries. Along with the worry of backdoors within the software, there have been several recent cases of severe bugs in secure software, which can compromise anything that was previously kept secure.
Most encryption uses a secret encryption key, which is used both to encrypt and to decrypt. This is known as private-key encryption, and the most robust of these methods is AES (Advanced Encryption Standard). The key must be stored somewhere, and is typically placed in a digital certificate which is stored on the computer, and can be backed up onto a USB device. The encryption key is normally derived from a password chosen by the user.
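The step from a user's password to an encryption key can be sketched with a key-derivation function. This is a minimal illustration using Python's standard library; the password, salt size and iteration count here are illustrative values, not a security recommendation.

```python
import hashlib
import os

# Derive a 256-bit key (suitable for AES-256) from a password using PBKDF2.
password = b"correct horse battery staple"   # illustrative password
salt = os.urandom(16)                        # random salt, stored with the ciphertext
key = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000, dklen=32)

print(len(key))  # 32 bytes = 256 bits
```

The salt ensures two users with the same password get different keys, and the high iteration count slows down password-guessing attacks.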
Along with this we need to prove the identity of the user, and also that the data has not been changed. For this we use a hash signature, which allows an almost unique code to be created for a block of data. The most popular methods for this are MD5 and SHA.
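Both hash methods are available in Python's standard hashlib module, which makes the "almost unique code" easy to demonstrate: even a one-character change to the data produces a completely different digest.

```python
import hashlib

data = b"The quick brown fox jumps over the lazy dog"

# MD5 and SHA-256 digests of the same data:
print(hashlib.md5(data).hexdigest())
# 9e107d9d372bb6826bd81d3542a419d6
print(hashlib.sha256(data).hexdigest())
# d7a8fbb307d7809469ca9abcb0082e4f8d5651e46d3cdb762d02d0bf37c9e592

# Adding a single full stop changes the digest entirely:
print(hashlib.sha256(data + b".").hexdigest())
```

This property is what lets a hash act as an integrity check: if the data is tampered with in transit, the recomputed digest will not match.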
Encryption is the ultimate nightmare for defence agencies, as it makes it almost impossible to read messages from enemies. The possibilities are to find a weakness in the methods used (such as in OpenSSL), to attack the encryption keys (such as with weak passwords) or, probably the easiest, to insert a backdoor in the software that gives defence agencies a method to read the encrypted files.
There has been a long history of defence agencies blocking the development of high-grade cryptography. In the days before powerful computer hardware, the Clipper chip was proposed, where a company would register to use it and be given a chip, and where government agencies kept a copy of the encryption keys.
In 1977, Ron Rivest, Adi Shamir, and Leonard Adleman at MIT developed the RSA public-key method, where one key could be used to encrypt (the public key) and only a special key (the private key) could decrypt the cipher text. Martin Gardner, in his Mathematical Games column in Scientific American, was so impressed with the method that he published an RSA challenge, for which readers could send a stamped addressed envelope for the full details of the method. The open distribution of a method which could be used outside the US worried defence agencies, and representations were made to stop the paper leaving the US but, unfortunately for them, many copies had gone out before anything could be done about it.
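The core of the 1977 scheme fits in a few lines of arithmetic. This toy sketch uses tiny primes purely for illustration (real RSA keys use primes hundreds of digits long, with padding and other safeguards); the numbers are the familiar textbook values, not anything from the original paper.

```python
# Toy RSA: key generation, encryption with the public key,
# decryption with the private key.
p, q = 61, 53
n = p * q                    # 3233: the modulus, part of both keys
phi = (p - 1) * (q - 1)      # 3120: Euler's totient of n
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent: e*d ≡ 1 (mod phi)

message = 65                 # a message encoded as a number smaller than n
cipher = pow(message, e, n)  # anyone can encrypt with (e, n)
plain = pow(cipher, d, n)    # only the holder of d can decrypt

print(cipher, plain)         # 2790 65
```

The asymmetry is the whole point: the public key (e, n) can be published to the world, yet recovering d from it requires factoring n, which is infeasible for large primes.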
Phil Zimmermann was one of the first to face up to defence agencies with his PGP software which, when published in 1991, allowed users to send encrypted and authenticated emails. For this the United States Customs Service opened a criminal investigation for a violation of the Arms Export Control Act, under which cryptographic software was classed as a munition. Eventually the charges were dropped.
And the rest of the article: