IP Expo: Where IT & physics collide - the relevance of quantum computing

News by Tony Morbin

Brian Cox explained that quantum computing would be so effective at factoring the large numbers underpinning cryptography that it would render classical cryptography redundant.

In the opening keynote presentation of IP Expo, Professor Brian Cox's lecture delivered on being both engaging and informative. While it straddled the space-time continuum and ran from the Big Bang to Newton, Einstein, Hawking and beyond, its primary focus was cosmology. After some very cool simulations of colliding black holes, demonstrating energy fluctuations that created waves in the density of particles in space and produced a pattern of circles of galaxies caused by ripples in space, he finally dipped into its relevance for cyber-security.

In an explanation-cum-demonstration of Einstein's theory of relativity, Cox used the example of holding two horizontal mirrors with a light beam bouncing vertically between them; the beam, of course, travelled at the speed of light. As he walked from one side of the stage to the other, to the moving holder of the mirrors the beam continued to go straight up and down. To an observer in the audience, however, the light travelled diagonally from one mirror to the new position of the other, and diagonally back again, tracing a V and covering a longer distance than it did for the holder of the mirrors. The light therefore took longer to travel between the mirrors for the static audience than for the moving holder, yet both measurements, though different, were correct. Similarly, if we bounce a ball on Earth it appears to go up and down in the same place, but it is not returning to the same position in space, because the Earth itself is moving through space (it orbits the Sun at roughly 30 km per second).

It is the ability of quantum systems to be in alternate states simultaneously, known as superposition, that is of interest in cryptography and cyber-security.

In quantum computing a quantum bit, or qubit, is a unit of quantum information, the quantum analogue of the classical bit. The information itself has a physical presence. A qubit has two basis states, such as the vertical and horizontal polarisation of a photon, and can exist in a superposition of both at once.
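The idea above can be sketched in a few lines of Python. This is an illustration of my own, not from the talk: a qubit is modelled as a pair of complex amplitudes over its two basis states (here, horizontal and vertical polarisation), with the squared magnitudes giving measurement probabilities.

```python
import math

def make_qubit(alpha, beta):
    """Return a normalised qubit state (alpha, beta) over two basis states,
    e.g. horizontal and vertical photon polarisation. The squared magnitudes
    of the amplitudes must sum to 1, so we normalise the inputs."""
    norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    return (alpha / norm, beta / norm)

# An equal superposition of both polarisations: the qubit is, in a sense,
# in both states at once until measured.
plus = make_qubit(1, 1)
prob_horizontal = abs(plus[0]) ** 2  # probability of measuring "horizontal"
print(round(prob_horizontal, 6))     # 0.5, i.e. a 50/50 outcome
```

Measuring the qubit collapses it to one basis state with these probabilities, which is what makes interception of quantum information detectable, as discussed below.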

Describing 250 qubits in an entangled system would require on the order of 10^80 classical bits – comparable to the number of atoms in the known universe of two trillion galaxies – so a very big number.

Talking to SC Media UK after the presentation, Cox explained that quantum computing would be so effective at factoring the large numbers underpinning cryptography that it would render classical cryptography redundant. And when quantum techniques are used to transfer encryption keys, any interception of the message would be detected – looking at it would change the message itself – making it impossible to hack based on our current knowledge.
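Why interception is detectable can be shown with a toy simulation. This is a hedged sketch of my own in the spirit of the BB84 quantum key distribution protocol, not an example Cox gave: measuring a photon in the wrong polarisation basis disturbs it, so an eavesdropper who intercepts and resends photons introduces errors the legitimate parties can spot.

```python
import random

def transmit(bit, prep_basis, measure_basis):
    """Toy model of measuring a polarised photon: if the measurement basis
    matches the preparation basis the bit is read faithfully; otherwise the
    outcome is random (the superposition collapses unpredictably)."""
    return bit if prep_basis == measure_basis else random.randint(0, 1)

def error_rate(rounds, eavesdrop):
    """Fraction of rounds where the receiver's bit differs from the sender's,
    assuming (for simplicity) the receiver measures in the sender's basis."""
    errors = 0
    for _ in range(rounds):
        bit, basis = random.randint(0, 1), random.choice("+x")
        if eavesdrop:
            eve_basis = random.choice("+x")
            seen = transmit(bit, basis, eve_basis)       # Eve measures...
            received = transmit(seen, eve_basis, basis)  # ...and resends
        else:
            received = transmit(bit, basis, basis)
        errors += received != bit
    return errors / rounds

random.seed(1)
print(error_rate(10000, eavesdrop=False))  # 0.0: a clean channel
print(error_rate(10000, eavesdrop=True))   # ~0.25: interception shows up
```

Eve guesses the right basis half the time; when she guesses wrong, her resent photon gives the receiver a random bit, producing errors in about a quarter of all rounds – a statistical fingerprint of eavesdropping.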

Also after the main presentation, Cox dipped into the topic of AI in discussion, agreeing that it is currently probably at the level of simulating a cockroach rather than a human brain. Though it has nothing to do with cyber-security, he went on to say that if it were ever possible to simulate the human experience of the world, there would be no reason to dispute the idea that we are all a simulation – which throws into question not just the relevance of quantum computing, but of everything.

