This is the fourth in a series of blogs aimed at giving everyone a basic understanding of this immensely important research field, which is poised to become mainstream within a few years and to significantly impact our daily lives.
In my previous blogs, I gave a high-level view of some fundamental differences between classical and quantum computers, and started off with some basic concepts of quantum mechanics that are needed for building quantum computers. While I have not yet completed the foundations, many of us would be keenly interested in understanding why we need quantum computers in the first place, when there are so many powerful supercomputers available. Jump in and read on to get some insights.
The fact of the matter is that despite around six decades of constant growth in the power of computing systems, there are still important problems and applications that are not solvable with the current breed of conventional computing architectures. These are mathematically termed hard or complex problems. In the language of computational complexity, such problems require super-polynomial (often exponential) time as the input size increases [Ref 1]. Since they demand a huge amount of memory and/or computation time at useful data sizes, current supercomputers would take tens or hundreds of years to solve them [Ref 2].
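To get a feel for what "exponential time" means in practice, here is a minimal sketch (an illustration, not a benchmark) of how the work done by naive trial-division factoring grows with the size of the input: for an n-bit number N, the search must try on the order of sqrt(N), i.e. about 2^(n/2), candidate divisors.

```python
# Illustration of exponential scaling: the number of candidate divisors
# that naive trial-division factoring must examine for an n-bit number.
# A naive search tries divisors up to sqrt(N); for an n-bit number N,
# sqrt(N) is about 2**(n/2) candidates.
def trial_division_candidates(bits):
    return 2 ** (bits // 2)

for bits in (32, 64, 128, 256):
    print(f"{bits:4d}-bit number: ~{trial_division_candidates(bits):.2e} candidates")
```

Doubling the input size squares the work: a 64-bit number needs ~4 billion candidate divisors, while a 2048-bit RSA modulus would need an astronomically large number, which is exactly why such problems defeat classical supercomputers.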
Some currently intractable problems include the chemistry and molecular-dynamics simulations needed to understand and design chemical reactions, ranging from nitrogen fixation [Ref 3], the basis of fertilizer production, to the design of pharmaceuticals [Ref 4, 5]. There are also problems in materials science, such as finding compounds for better solar cells, more efficient batteries, and new types of power lines that can transmit energy without losses [Ref 6]. Finally, Shor’s algorithm [Ref 7], which harnesses a quantum computer to factor large numbers efficiently, could make current cryptosystems vulnerable to cyberattacks and eavesdropping (I will be talking about Shor’s and Grover’s algorithms in future blogs).
Quantum computers have the potential to speed up computation to such an extent that these problems, and others of similar complexity, would become solvable. For example, on a classical computer it would take millions of years to find the ground-state energy of a large molecule precisely, or to crack the encryption that secures today’s internet traffic and blockchains. On an appropriately sized quantum computer, such problems could be solved in minutes or even seconds [Ref 2].
So, where are we in this journey of building quantum computers, and when will they start making a difference in the everyday scheme of things? At this juncture, we may claim that an inflection or tipping point has been reached: in the past 4-5 years, scientists and industry have been able to develop small and intermediate-scale quantum computers in their labs [Ref 8, 9]. Prof. John Preskill, a very senior scientist at the California Institute of Technology, has coined the term NISQ, which stands for Noisy Intermediate-Scale Quantum [Ref 10]. It refers to the class of quantum computers being built now and in the foreseeable future: systems with 20-1000 qubits that are highly error prone, because current technology does not have the wherewithal to perform sufficient error correction. To give a feel for the numbers, error rates in current (NISQ) quantum computers are of the order of 0.1% to 1% per operation. An error rate of 0.1% means that, on average, for every thousand operations executed on the computer there is one error.
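A quick back-of-the-envelope check shows why even a 0.1% error rate is crippling: if each operation independently succeeds with probability 1 - p, the chance that a run of N operations finishes with no error at all is (1 - p)^N. The sketch below (a simplified independence model, not a claim about any specific device) works this out for the figures quoted above.

```python
# Probability that a computation of N operations completes error-free,
# assuming each operation independently fails with probability p.
def error_free_probability(p, n_ops):
    return (1 - p) ** n_ops

p = 0.001     # 0.1% per-operation error rate, as quoted for NISQ devices
N = 1000      # a thousand operations
print(f"P(no error in {N} ops) = {error_free_probability(p, N):.3f}")
# roughly 0.368, i.e. about e**-1: nearly 2 out of 3 runs contain an error
```

So even at the optimistic end of the NISQ error range, a modest thousand-operation computation fails most of the time, which is why error correction (and the thresholds discussed next) matters so much.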
The chart below, from Google’s research labs [Ref 11], shows the number of qubits on the x-axis and the error rate on the y-axis. For effective quantum computers, the error rate cannot exceed the threshold of about 10^-2 (that is, no more than 1%). Once we are able to develop quantum computers with more than 100 qubits below this threshold (the blue area at the bottom of the graph), some applications could be developed that are computationally more efficient than on classical computers. To achieve a significant benefit over current classical systems, we need quantum computers with 10,000 or more qubits and error rates below 10^-4 (0.01%), as shown in the green area.
There is a huge amount of R&D under way across global academia and industry (with China, the USA and Europe leading from the front) to build quantum computers beyond the NISQ era and to develop fascinating, novel applications that cannot run on current classical computing systems, and that we may not even have imagined yet.
I look forward to sharing some more interesting stuff in my later blogs…
- https://cra.org/ccc/wp-content/uploads/sites/2/2018/11/Next-Steps-in-Quantum-Computing.pdf (Next Steps in Quantum Computing: Computer Science’s Role)
- https://arxiv.org/abs/1605.03590 (Elucidating Reaction Mechanisms on Quantum Computers)
- https://arxiv.org/abs/1808.10402 (Quantum computational chemistry)
- https://arxiv.org/abs/1706.05413 (Quantum Information and Computation for Chemistry)
- https://www.nature.com/articles/s41467-017-01362-1 (Mixed-quantum-dot solar cells)
- https://arxiv.org/abs/quant-ph/9508027v2 (Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer)
- https://www.nature.com/articles/nature08812 (Quantum Computers)
- https://www.pnas.org/content/114/13/3305 (Experimental comparison of two quantum computing architectures)
- https://arxiv.org/abs/1801.00862 (Quantum Computing in the NISQ era and beyond)
- https://www.tomshardware.com/news/google-72-qubit-quantum-computer,36617.html (Google Unveils 72-Qubit Quantum Computer with Low Error Rates)