Introduction
Most of us in IT are occupied with the day-to-day challenges of digital transformation and its impact on our business, our customers, and our employees. It is hard to focus on things that may or may not happen ten years down the road. Quantum computing is one of those technologies that has been predicted to arrive "in the next ten years" for several decades. Quantum computers will be able to explore many computational paths simultaneously, whereas classical computers work through one calculation at a time. That compute power could drive new breakthroughs in science, medicine, materials, finance, and energy, and all the innovations that could make the world smarter, healthier, and safer. But there could be a downside as well. One well-publicized consequence of this amazing compute power is the ability to crack our current encryption codes and render them obsolete. That would be a major disruption and a major exposure for the security of our public and private institutions. Everything from the internet, to our personal devices, to our national security would be exposed.
How real is the threat of Quantum computers cracking today’s encryption codes?
According to DigiCert, it would take a standard desktop computer a little over 6.4 quadrillion years to break a 2048-bit RSA key by factorization. A classical computer works through one combination at a time, whereas an n-qubit quantum computer can, in effect, work through 2^n combinations at once. Studies have calculated that a 20-million-qubit quantum computer could factor a 2048-bit RSA private key in 8 hours! Today the largest quantum computer, from Google, has 72 qubits, so there is a way to go.
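To get a feel for the gap between 72 qubits and 20 million, it helps to look at how fast 2^n grows. This is back-of-envelope arithmetic only (the qubit counts are taken from the figures above; everything else is illustrative):

```python
import math

# Illustrative arithmetic only: an n-qubit register spans 2**n basis states,
# so each added qubit doubles the size of the state space.
for n in (1, 10, 72, 20_000_000):
    # Count the decimal digits of 2**n without building the huge number itself
    digits = int(n * math.log10(2)) + 1
    print(f"{n:>10,} qubits -> 2^{n:,} states (~{digits:,} decimal digits)")
```

Even at 72 qubits the state space is already a 22-digit number; at 20 million qubits the digit count itself runs into the millions.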
Several experts argue that viable quantum computers are still decades away. Quantum computers use quantum particles, which are very unstable and must be run at temperatures near absolute zero to slow the particles down and limit energy loss. These experts point out that experimental quantum error correction and fault tolerance are still in their infancy; the average error rate of qubits would need to be reduced by a factor of 10 to 100 before a quantum computer would be robust enough to support error correction at scale.
Several companies have built experimental quantum computers and demonstrated them by solving, in minutes, mathematical problems that would take a classical supercomputer thousands of years. Technology companies, research universities, cloud providers, telcos, and financial institutions have all invested in the technology, and every day brings announcements of new breakthroughs. Hartmut Neven, the director of Google's Quantum Artificial Intelligence lab, postulates that quantum computers are improving at a "doubly exponential" rate, as opposed to the exponential rate of Moore's Law. If this holds, a 20-million-qubit computer is just around the corner.
It appears that it is not a question of "if" quantum computers will arrive, but "when."
The development of large quantum computers will eventually happen. It is time to start thinking about post-quantum encryption.
Current encryption codes are based on functions that are easy to compute in one direction but difficult to reverse, such as the multiplication of two prime numbers. It is easy to multiply two primes and produce a large number as the product. It is hard, however, to start with that large number and determine the primes that created it, a process called factorization, which becomes increasingly difficult as the number gets larger. Quantum computers can perform factorization much faster than classical computers and get exponentially better at it as they gain more qubits.
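The asymmetry can be felt even at toy scale. The sketch below uses small, assumed primes (real RSA keys use 2048-bit numbers) and times the easy direction, multiplication, against the hard direction, recovering the factors by trial division:

```python
import time

p, q = 1000003, 1000033          # two small primes (assumed, for illustration)

start = time.perf_counter()
n = p * q                        # easy direction: one multiplication
mul_time = time.perf_counter() - start

def factor(n):
    """Recover the prime factors by trial division -- the hard direction.

    Assumes n is an odd product of two primes; work grows with sqrt(n).
    """
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    return None

start = time.perf_counter()
found = factor(n)
fac_time = time.perf_counter() - start

assert found == (p, q)
print(f"multiply: {mul_time:.6f}s, factor: {fac_time:.6f}s")
```

For these 7-digit primes the factoring step already takes hundreds of thousands of loop iterations; each additional digit roughly triples the work, which is why a 2048-bit modulus is out of reach for any classical machine.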
The good news is that security experts have developed encryption methods that even a quantum computer is not expected to crack. These methods do not depend on factorization and would be resistant to decryption by quantum computers, so it is already possible to safeguard data today against future quantum attacks. Collectively called post-quantum encryption, they include lattice-based cryptosystems, code-based systems built on error-correcting codes, and multivariate cryptography. The problem is that the codes are not yet standardized.
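To make the lattice-based idea slightly more concrete, here is a deliberately tiny sketch in the style of a learning-with-errors (LWE) scheme. Everything here, parameter sizes, names, and structure, is an illustrative assumption; it is nowhere near a real post-quantum algorithm, and the parameters are far too small to be secure:

```python
import random

n, q, m = 8, 1009, 20            # secret length, modulus, sample count (toy sizes)

def keygen():
    """Secret key s; public samples (A, b) with b = A*s + small noise (mod q)."""
    s = [random.randrange(q) for _ in range(n)]
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]   # small noise hides s
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
    return s, (A, b)

def encrypt(pub, bit):
    """Sum a random subset of samples; shift by q//2 to encode a 1 bit."""
    A, b = pub
    subset = [i for i in range(m) if random.random() < 0.5]
    c1 = [sum(A[i][j] for i in subset) % q for j in range(n)]
    c2 = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return c1, c2

def decrypt(s, ct):
    """Strip out <c1, s>; the residue is near 0 for bit 0, near q//2 for bit 1."""
    c1, c2 = ct
    v = (c2 - sum(c1[j] * s[j] for j in range(n))) % q
    return 0 if min(v, q - v) < q // 4 else 1

s, pub = keygen()
for bit in (0, 1):
    assert decrypt(s, encrypt(pub, bit)) == bit
```

The point of the sketch is the design shape: security rests on noisy linear algebra (recovering s from (A, b) when noise is present is believed hard, even for quantum computers), not on factoring, so Shor-style attacks do not apply.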
Standardization is imperative: we live in a connected world, and not only are private- and public-key cryptography algorithms needed to secure against both pre- and post-quantum computers, but these algorithms must interoperate with existing communications protocols and networks. The United States' NIST and the Chinese Cryptographic Society are each holding competitions to find such algorithms. NIST has narrowed the field of potential post-quantum encryption tools to a bracket of 26 candidates being considered for standardization, and it is asking the cryptography community to focus on analyzing their performance to get better data on how they will behave in the real world. This second round also focuses on devices with limited processing power: smart cards, tiny devices for the Internet of Things, and individual microchips that also need protection. Low latency is a key concern as more applications move to real time.
Summary
Standardization of post-quantum algorithms is two to three years out, but we need to be thinking about our encryption needs today: assessing the risks, evaluating how we would retrofit the standards when they become available, and monitoring the progress of quantum computing. If your data needs protection for only the next 5 to 10 years, you may opt to stay with current encryption schemes; but if the need extends into the post-quantum era, you may need to adopt the new schemes as soon as they become available, even before viable quantum computers exist. Retrofitting encryption across the mountains of data we hold today will be a very expensive and time-consuming process.
#Blog #Hu'sPlace