The rise in connectivity makes it difficult to keep your business secret, prompting more elaborate encryption. By Mark Mayne.

In 1597, Francis Bacon coined the phrase scientia potentia est, “for knowledge itself is power”, and this is as true today as it was then. Bacon could not have foreseen the change that has overtaken information, and business data in particular. Digital information is now one of the cornerstones of business, and never before has so much knowledge been available so easily. However, keeping business secrets has become correspondingly difficult as connectivity has risen.

One of the central approaches to keeping data secure is to encrypt it, both ‘in flight' and ‘at rest'. This approach is mandated in many compliance situations, and encryption is in widespread use in almost all areas of digital life, from keeping military secrets to consumer banking and online shopping.

Data encryption is seeing a surge in popularity – in 2007 the US government reported that 71 per cent of companies used encryption for data in transit. But what exactly is encryption, how does it work and what does the future hold?

Encryption is the concept of taking plain text information and using an algorithm or cipher to render it unreadable to anyone not possessing the ‘key'. Historians believe that the first use of a cipher was around the seventh century BC, when the Greek poet Archilochus described the use of a scytale (rhymes with Italy) cipher to transmit information between generals during battle. The scytale was a strip of parchment wound round a staff of pre-arranged length and diameter. The Roman writer Plutarch described its operation: “Whenever they wish to send some secret and important message, they make a scroll of parchment long and narrow, and wind it round their scytale, leaving no vacant space thereon. After doing this, they write what they wish on the parchment.”

Inventions such as Thomas Jefferson's wheel cipher, then the German Enigma machines of the second world war, made ciphers far more complex and robust. However, their usage was mainly in the military arena. The history of modern Public Key Encryption (PKE) begins in the 1970s, when Whitfield Diffie and Martin Hellman published a paper titled New Directions in Cryptography. Their research mapped out the Diffie-Hellman key exchange, a move that proved a major advance in encryption technology, and is still widely used in protocols such as Secure Sockets Layer (SSL) and Secure Shell (SSH).
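The elegance of the Diffie-Hellman exchange is that it fits in a few lines. Below is a toy sketch in Python; the prime, generator and private keys are deliberately tiny illustrative values – real deployments use primes of 2,048 bits or more.

```python
# Toy Diffie-Hellman key exchange over a small prime field.
# These parameters are far too small for real use; they simply
# show how two parties derive the same secret over a public channel.
p = 23          # public prime modulus (toy-sized)
g = 5           # public generator

a = 6           # Alice's private key (kept secret)
b = 15          # Bob's private key (kept secret)

A = pow(g, a, p)   # Alice publishes A = g^a mod p
B = pow(g, b, p)   # Bob publishes B = g^b mod p

# Each side combines its own private key with the other's public value.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)

assert shared_alice == shared_bob   # both arrive at the same secret
print(shared_alice)                 # -> 2
```

An eavesdropper sees only p, g, A and B; recovering a or b from them is the discrete logarithm problem, which is what gives the exchange its strength.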

The Diffie-Hellman paper was soon followed by an independent paper from Ron Rivest, Adi Shamir and Leonard Adleman, which described the RSA algorithm (the name is taken from the surnames of the inventors). This enabled the use of digital certificates to authenticate the sender of the encrypted text, a vital step in defeating man-in-the-middle attacks, where public keys could be switched by an adversary, allowing them to decrypt a message, read and modify its contents, then re-encrypt it and forward it on to the originally intended recipient.

Taher Elgamal, CSO, Axway and inventor of SSL, said: “The problem that we were originally trying to solve was of exchanging secret keys. Before Diffie-Hellman, a physical courier was required to do this job. The RSA group's work with digital signatures had the potential to move everyone into a paperless society – there was recognition at a very early stage that this would change the world.”

And change the world it did. Coupled with the connectivity of the internet, the potential to securely exchange information has proved to be one of the greatest success stories in human history. Jon Callas, CTO, PGP, summarised: “Encryption makes everything we do today possible. It enables secure communication between parties that have had no prior relationship, which is at the heart of the digital economy.”

However, it wasn't always so. Initially, encryption had a rough ride, firstly at the hands of governments, and then from the business community. So concerned was the US government about the power of encryption falling into the wrong hands that before 2000, symmetric key sizes larger than 56 bits, RSA key sizes greater than 512 bits, and elliptic curve keys greater than 112 bits were considered “strong” encryption, and were subject to strict US export restrictions. Export to blacklisted countries such as Cuba, Iran and Iraq is still banned.

Nir Gertner, CTO, Cyber-Ark, said: “PKE suffered very badly from what Gartner has dubbed the hype cycle. Back in the early 90s, the hype over PKE was enormous, and its potential to offer secure authentication for the citizens of the world was seen as the next big thing, the key to the paperless office, with everything being digitally signed. In many ways it was to happen like that, but just not quite as originally intended. Products such as Microsoft Passport demonstrated that it was necessary for individuals to be authenticated on a number of different levels, according to what they were seeking to do at the time – a blanket solution was not wanted, so it failed.”

After this initial hype, the market recovered gradually, and as demand for online transactions burgeoned, so did the need for secure ways of conducting them. RSA became the de facto encryption algorithm on the internet, and the main change between then and now is the gradually growing key lengths required to maintain security. The US National Institute of Standards and Technology (NIST) issues regular updates on what strengths of encryption are suitable for various purposes. Current recommendations are that 80-bit keys are sufficient for data authentication and software and hardware integrity testing through 2010, and should serve entity authentication applications until 2013. However, two-key Triple DES and the Skipjack algorithm are not recommended beyond 2010.

There will clearly come a point when RSA key sizes are too large to be used in everyday transactions. Callas sees a ready solution, however: “There's a real push to move from RSA and Diffie-Hellman towards elliptic curve-based ciphers. These promise to be much shorter and faster than integer-based keys – generally they are only around twice the length of an equivalent symmetric key, whereas an RSA key grows far faster as its security level increases. For example, a 128-bit-equivalent RSA key is about 3,000 bits, whereas a 256-bit-equivalent key is around 15,000 bits. Clearly, these sizes are too cumbersome. This is mainly due to the density of prime numbers. However, the greatest barrier to adoption of elliptic curve ciphers so far is one of intellectual property, rather than any practical considerations. A variety of patents covering elliptic curve ciphers are held by Certicom, which hasn't been entirely clear on exactly what is covered by its patents. This has caused some reluctance to license the technology – the situation has become a bit of a stalemate.”
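The mechanics behind elliptic-curve ciphers can be sketched on a toy curve. Below is a minimal elliptic-curve Diffie-Hellman exchange over a 17-element field; the curve, generator and private keys are textbook-sized illustrations, not real parameters (deployed curves such as P-256 work over 256-bit fields).

```python
# Toy elliptic-curve Diffie-Hellman on y^2 = x^3 + 2x + 2 over GF(17).
P_MOD, A = 17, 2
G = (5, 1)   # a generator point on this curve
INF = None   # the point at infinity (group identity)

def add(p1, p2):
    """Elliptic-curve point addition over GF(P_MOD)."""
    if p1 is INF: return p2
    if p2 is INF: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return INF                                 # vertical line: P + (-P)
    if p1 == p2:                                   # doubling: tangent slope
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                          # addition: chord slope
        m = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (m * m - x1 - x2) % P_MOD
    return (x3, (m * (x1 - x3) - y1) % P_MOD)

def mul(k, pt):
    """Scalar multiplication k*pt by double-and-add."""
    acc = INF
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

a, b = 3, 7                                        # private keys
shared = mul(a, mul(b, G))                         # Alice computes a*(b*G)
assert shared == mul(b, mul(a, G))                 # Bob's b*(a*G) matches
```

Security rests on the elliptic-curve discrete logarithm problem (recovering a from a*G), which is believed much harder per key bit than factoring, hence the shorter keys.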

Certicom launched the Certicom Elliptic Curve Cryptography (ECC) challenge back in 1997, throwing down the gauntlet to any mathematicians who believed they could defeat Certicom's algorithms. The last success was against its 109-bit challenge, which took the equivalent of 2,600 computers 17 months to crack. Commercial grade products use 163-bit keys, which are roughly one hundred million times harder.

The acquisition of Certicom by BlackBerry-manufacturer RIM in February 2009 should also push the area forward, according to Callas: “This should make licensing easier. A larger corporation such as RIM should be more willing to license IP such as this. With a smaller company it may be the only asset it has, so it is less keen to see it distributed. Additionally, the US and UK governments, and the banking industry, are encouraging businesses to begin moving in this direction, and I would expect to see adoption within the next five to seven years.”

Axway's Elgamal was less bullish over the newer technology however: “Elliptic curve cryptography is far more efficient, computationally, and this has made it a popular talking point. However, the entire internet uses RSA keys currently, so replacing RSA would be a tough operation. The two main issues are – do we really need shorter keys (that elliptic curve would offer), and is there a genuine threat to prime factorisation? The last would have to be the case to make replacing RSA feasible or necessary.”

Ulf Mattsson, CTO, Protegrity, believes that the future lies here also: “I certainly see RSA as having a medium-term future, simply by increasing the key-lengths as computational speeds increase. ECC looks to be the next step. I believe that the existing patent issues will be sorted out in the near future. While Certicom is the leader in this space, Sun has taken the open source route and released a variety of IP in this area also. Additionally, ECC is particularly good for use on smaller devices, such as mobile handsets and netbooks.”

Understandably, compliance has proven to be a huge driver in the encryption market. Callas believes that it's the greatest single factor at the moment: “PCI compliance is the main driver at the moment. Although it's not a new standard, we're seeing some of the larger compliance ‘milestones' reached this year, and this is driving a lot of businesses very hard. Failure to comply is not immediately destructive to a business, but the financial incentives are particularly potent: we're seeing a lot of businesses get very ‘incentivised' by them.

“Additionally, we're seeing a lot of traction resulting from cloud computing. Aside from the current hype, cloud computing is a reality, and a mixture of many technologies and processes that have been maturing for years. It will increasingly become business-critical, and this ultimate loss of perimeter will drive encryption uptake – especially key management standardisation efforts.”

Key management standards are currently undergoing final ratification, and vendors should be shipping products containing them by mid-next year, solving one of PKE's greatest bugbears.

A popular candidate for upending the status quo is quantum technology. On the one hand, quantum computing may be able to crack ciphers based on number factorisation (such as RSA) very quickly; on the other, the opportunities for point-to-point ‘uncrackable' quantum encryption could render some current PKE applications redundant. Elgamal is unconvinced: “Quantum computing may well break all factorisation-based encryption, but we simply don't know yet. It holds great promise, but isn't a practical threat at the moment.”

Callas took a similar line: “Quantum computing is a potential threat to current cryptography, but that said we're not looking at the sudden arrival of quantum computers – that's a process that will take time and will happen gradually, just like any new technology. Compare this to transistors – when they were first developed there was a period of adoption for non-serious uses, such as radios and hearing aids, before they were accepted as being viable. Then computers were designed. We're not yet at this playful stage with quantum computing, so I see it as being a safe bet that we won't see them this side of ten years.

“There are a range of conferences running now devoted to post-quantum cryptography and there are a variety of solutions already at hand that are quantum-resistant, such as lattice-based ciphers which don't rely on prime number factoring as their strength. If we woke up tomorrow and quantum computers were a reality, there are solutions available to mitigate the issues this would cause.”

Unquestionably, quantum technology could cause a revolution in encryption security, as Gertner pointed out: “The really exciting thing about quantum encryption is that current algorithms rely on ‘difficult' mathematical problems for their security. Essentially, solutions are possible, just they would take an unfeasible length of time to calculate. Quantum encryption, on the other hand, is based on the behaviour of particles, and as far as we know, this is unbreakable under our understanding of quantum physics – it's not just difficult, it's impossible! That said, it's not entirely attack-resistant, as interception is possible.”

Interception is possible because of the engineering of the technology: not all photons arrive exactly as they were sent, producing a constant background error rate, much as online protocols must resend packets lost in transmission. An interceptor would be caught only if the extra errors they introduced pushed this rate high enough to arouse suspicion.

While quantum technologies promise much but deliver little practical value today, many think that technological barriers are not the greatest issue we have to deal with. Both Elgamal and Gertner believe the greatest challenge ahead lies in the engineering, rather than the mathematical, arena.

Said Elgamal: “The real issue in the encryption industry is that we haven't prepared very well for the future. Changing between algorithms is possible, but requires a lot of steps to perform. If, or when, the world decides that RSA is no longer viable, there will be a huge amount of work to do to change it! We're seeing an increasing stream of compromises – for example, the MD5 hash – of algorithms we had thought were secure. Now there's a scramble to replace MD5, and it's not a graceful way to migrate. We need to focus on a few simple solutions to this issue, but that requires investment in internet infrastructure, which isn't popular.”

Gertner also believes that engineering is the key: “The encryption methods we have now work, and future ones hold great promise. However, many of the issues that businesses face when implementing PKE solutions are not related to the technology, they are around effective key management. We need to use existing technology to engineer solutions to these issues, rather than focus exclusively on the future.”

While the world has changed much since Bacon's time, the importance of information has not. Clearly, as computational power increases, the strength of ciphers used to protect valuable data will have to increase also. It is also likely that quantum technology will have a significant impact on computing when it eventually makes it out of the laboratory and into the real world. However, this change is likely to occur gradually rather than suddenly, and the ability to react to technological change must be built into current and future PKE implementations. Without the ability to keep pace with this most technical of fields, any system will have a very finite lifespan.

The future

Everyone agrees quantum technology is the next big thing in encryption – but what is it?

There are two ‘quantum' technologies that affect the future of encryption in subtly different ways: quantum computing and quantum encryption.

The former involves processors that exploit atomic-scale, rather than conventional electronic, behaviour. Today's chips are made with lithographic techniques that create components only a fraction of a micron across. According to Moore's law, processor power doubles every 18 months, and if this continues, the speeds enabled by quantum processors will be required within the next ten years.

The theory behind this is based around the concept of qubits. Instead of conventional bits, which can be in one of two states (0 or 1), a qubit can be in a combination of both states at the same time. This is a ‘superposition', and it is by exploiting superpositions that a quantum computer gains its edge. For example, in a classical computer, a register of three bits can store only one of eight different numbers at a given moment, whereas a register of three qubits can hold all eight numbers in a quantum superposition. This essentially allows quantum computers to perform calculations in parallel, which has a huge impact on the execution time and memory required.
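The three-qubit register can be mimicked classically by tracking all eight amplitudes explicitly, which also shows why simulating large quantum registers is infeasible: the state vector doubles with every added qubit. A minimal Python sketch:

```python
import itertools

# Classical simulation of a 3-qubit register: the state is a vector of
# 2^3 = 8 amplitudes, one per basis state |000> .. |111>.
n = 3
amp = 1 / (2 ** (n / 2))       # equal-superposition amplitude, 1/sqrt(8)
state = [amp] * (2 ** n)       # all eight basis states held at once

# Measurement probabilities are amplitude squared and must sum to 1.
probs = [a * a for a in state]
assert abs(sum(probs) - 1) < 1e-9

# A classical 3-bit register, by contrast, holds exactly one of these:
for i, bits in enumerate(itertools.product('01', repeat=n)):
    print(''.join(bits), f'probability {probs[i]:.3f}')
```

Note the catch glossed over in popular accounts: measuring the register collapses the superposition to a single outcome, so quantum algorithms must be cleverly arranged to make the desired answer the likely one.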

One area where quantum computers would have an enormous impact is in factoring large numbers – and it is precisely the difficulty of factoring large numbers that currently accepted encryption standards, such as RSA, rely on for their strength against brute-force attacks. It is speculated that once a quantum factorisation engine is built, all current cryptographic systems of this kind will become insecure. However, although scientists have managed to build simple logic gates of two qubits, adding more has proven difficult.
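The classical difficulty is easy to see: trial division, the simplest factoring method, does work proportional to the square root of the modulus, which is why it is hopeless against real key sizes, while Shor's quantum algorithm would factor in polynomial time. A small Python sketch (the modulus here is a toy value):

```python
import math

# Classical trial division: cost grows with sqrt(n). A 2048-bit RSA
# modulus has a square root of ~1024 bits - utterly out of reach -
# whereas a quantum factorisation engine running Shor's algorithm
# would scale polynomially with the number of digits.
def factor(n):
    """Return the smallest non-trivial factor pair of n, or None if prime."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return None

print(factor(3233))   # a toy RSA-style modulus -> (53, 61)
```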

Quantum encryption is also based on exploiting the behaviour of particles at a quantum level. The currently accepted system uses photons transmitted via fibre-optic cable. A photon's polarisation can be measured on three axes: rectilinear (horizontal/vertical), diagonal and circular. However, the more that is known about one of these, the less can be known about the others: if an exact measurement is made of the diagonal polarisation, nothing about the other two can be known. This allows a string of photons to be used to make single-use, inherently self-destructing keys.

Using photons to make up qubits, quantum encryption uses one of two main techniques to ensure security – measurement protocols, or entanglement protocols. In the former, because measuring an unknown quantum state changes the state, it is impossible to intercept the message without changing it, making it clear that the interception has taken place. In the latter, the quantum states of two separate objects are linked or ‘entangled' so that they reflect each other's state. If one is measured, the other will change too. This also allows any interception to be uncovered. Additionally, the system potentially offers unlimited future security, as once the photon-string key has been measured it ceases to exist, making retrieval and future impersonation impossible, as well as making even unlimited computing power useless.
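A measurement protocol of this kind (the well-known BB84 scheme is the usual example) can be sketched as a classical simulation. The basis symbols, key length and random seed below are illustrative choices:

```python
import random

# Minimal BB84-style sketch: Alice encodes random bits in random bases
# (rectilinear '+' or diagonal 'x'); Bob measures in random bases.
# Where the bases happen to match, the bits agree and form the key.
# An eavesdropper forced to guess bases would disturb roughly a quarter
# of the matched positions, revealing the interception.
random.seed(1)   # fixed seed so the sketch is repeatable
N = 32
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice('+x') for _ in range(N)]
bob_bases   = [random.choice('+x') for _ in range(N)]

# Measuring in the wrong basis yields a random result.
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: publicly compare bases and keep only matching positions.
key   = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
check = [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
assert key == check   # with no eavesdropper, the sifted keys agree
```

In practice the two sides would also sacrifice a random sample of the sifted key to estimate the error rate, which is how an interceptor's disturbance is actually detected.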