The world of quantum computing explained in layman's (and Feynman's) terms.

The science behind the technology, and the places where it finds and will find use, are collated here in a technical yet lucid way, much in the spirit of Feynman's own maxim:

“Nature isn’t classical, dammit, and if you want to make a simulation of Nature you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”

Background

On May 7, 1981, renowned American theoretical physicist and Nobel laureate Prof. Richard P. Feynman gave the keynote talk at the 1st Conference on Physics and Computation at the Massachusetts Institute of Technology.

He addressed the problem of simulating nature on a classical computer, under the usual constraints of a classical computer: information in the form of "bits" is discrete (1 or 0), and hence all physical parameters required to describe natural phenomena, namely space and time (which are continuous in nature), must also be made discrete. While this approximation, if taken to be reasonably fine, works well for simulating systems governed by the classical laws of physics, the computer fails for systems that demand a quantum mechanical explanation.

How does a quantum computer work?

About 38 years later, in January 2019, IBM unveiled the world's first "circuit-based" commercial quantum computer, the "IBM Q System One", thereby taking humanity closer to Feynman's vision. Now the question arises: what is a quantum computer, and how is it different from a classical one?

A classical computer uses "bits" to carry information; a bit can assume either the binary value "1" or "0". In a quantum computer, the counterpart of the bit is the "qubit" (or quantum bit). A quantum computer maintains a sequence of qubits, each of which can exist in the state "1", the state "0", or a superposition of both states.

This property of qubits arises from the principle of "quantum superposition", which says that a quantum mechanical system can exist in several distinct quantum states at the same time. For a sequence of n qubits, a quantum computer can be in any superposition of 2^n states at one time, whereas a classical computer can only be in one of these states at any instant.
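As a rough illustration (a minimal classical simulation of the mathematics, not how a physical machine stores its state), an n-qubit register can be modelled as a vector of 2^n complex amplitudes. The Python/NumPy sketch below builds the uniform superposition over all 2^n basis states of a 3-qubit register; the variable names are illustrative:

```python
import numpy as np

n = 3  # number of qubits in the register

# An n-qubit state is a vector of 2**n complex amplitudes;
# index 0 corresponds to the basis state |00...0>.
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # start in the all-zeros basis state

# Hadamard gate on a single qubit; applied to every qubit, it maps
# |00...0> to the uniform superposition of all 2**n basis states.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

op = np.array([[1.0]])
for _ in range(n):
    op = np.kron(op, H)  # tensor product builds the n-qubit operator

state = op @ state
print(state)  # 2**n equal amplitudes, each 1/sqrt(2**n) ~= 0.354
```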

While this might seem to imply that quantum computers can hold exponentially more information than their classical counterparts, the existence of the computer in any of the 2^n states is a probabilistic one: when the final states of the qubits are measured, they will be found in only one of the 2^n possible states (this is similar to the collapse of a wave function upon measurement). It is generally incorrect to think of a system of qubits as being in one particular state before the measurement.
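Extending the toy simulation above, a measurement can be modelled as sampling one basis state with probability given by the squared magnitude of its amplitude (the Born rule), after which the register holds only that one outcome; the Bell state below is just a convenient example:

```python
import numpy as np

# Amplitudes of a 2-qubit Bell state, (|00> + |11>) / sqrt(2).
state = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Born rule: each basis state is observed with probability |amplitude|^2.
probs = np.abs(state) ** 2
outcome = int(np.random.choice(len(state), p=probs))
print(f"measured basis state: |{outcome:02b}>")

# After the measurement the register holds only the observed state.
state = np.zeros_like(state)
state[outcome] = 1.0
```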

The qubits are in a superposition of states before any measurement is made, which directly affects the possible outcomes of the computation. One might ask: how are these qubits physically realised in a quantum computer? A qubit is essentially a two-state quantum device, which can be, for example, an electron's spin (which can be up, or +½, and down, or -½) or the polarisation of a single photon (vertical or horizontal). Qubits developed in laboratories so far have mostly been manufactured from the electron or the nucleus of a single phosphorus atom.

After discussing at length the intricacies of a quantum computer, a question comes up: how is a quantum computer different from its classical counterpart with regard to performance? An overly simple way of looking at it would be to say that since a quantum computer can exist in an exponentially higher number of states than a classical computer, it can perform exponentially more calculations within a given time span.

A more scientifically accurate explanation is that a quantum computer itself isn't faster; it is a different model of computation, and the inherent properties of that model allow a quantum computer to run algorithms which are asymptotically faster than the best known classical algorithms for the same computational problem.

For example, consider the traditional problem of prime factorisation of a positive integer: a 512-bit number would take about a year (3.154 × 10^7 seconds) to factor with the fastest known classical algorithm, whereas the corresponding quantum algorithm (namely, Shor's algorithm) would take on the order of a second. The pitfall of this different model, however, is that only very specific algorithms can be developed to fit it and actually be desirably faster than a classical solution.
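To see where this gap comes from, compare the asymptotic operation counts: the general number field sieve, the fastest known classical factoring algorithm, runs in roughly exp((64/9)^(1/3) · (ln N)^(1/3) · (ln ln N)^(2/3)) steps, while Shor's algorithm needs time polynomial in the bit length, on the order of (log N)^3. The back-of-the-envelope Python sketch below plugs a 512-bit number into both formulas; it counts abstract operations, not seconds, and the constants are illustrative:

```python
import math

bits = 512
ln_N = bits * math.log(2)  # natural log of a 512-bit number

# General number field sieve: exp((64/9)^(1/3) (ln N)^(1/3) (ln ln N)^(2/3))
c = (64 / 9) ** (1 / 3)
gnfs_ops = math.exp(c * ln_N ** (1 / 3) * math.log(ln_N) ** (2 / 3))

# Shor's algorithm: polynomial in the bit length, roughly (log2 N)^3.
shor_ops = bits ** 3

print(f"number field sieve: ~{gnfs_ops:.1e} operations")
print(f"Shor's algorithm:   ~{shor_ops:.1e} operations")
```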

Quantum computers: the overwhelming odds

While existing quantum algorithms certainly promise a huge jump in performance, quantum computers are not yet of significant commercial value, for several reasons. One of them is their highly specific nature, mentioned earlier, which is an inherent property of the model of computation. Also, any industrially applicable solution to a problem has to be sufficiently low-cost, which a quantum computer isn't.

There are several reasons for this: for a quantum computer to be of any use, there should be minimal "noise" (erroneous random signals) in the processing of information, which translates into the best possible orderly arrangement of qubits. A phenomenon called "decoherence" prevents this, and in order to reduce the effect of noise these systems have to be kept under super-cooled conditions (the quantum processor by D-Wave has to be kept at a temperature below 0.02 K, which is below the temperature of the cosmic background radiation of the universe).

In addition, some systems deploy "error correction" codes which reduce the effect of decoherence, but at the expense of using up qubits from the system.
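To get a flavour of how error correction spends qubits, consider the classical analogue, the three-bit repetition code: one logical bit is stored redundantly in three physical bits, and a single accidental flip is undone by majority vote. (Real quantum codes are far more intricate, since quantum states cannot simply be copied; this Python sketch only illustrates the redundancy trade-off.)

```python
import random

def encode(bit):
    """Store one logical bit redundantly in three physical bits."""
    return [bit, bit, bit]

def noisy_channel(bits, flip_prob=0.1):
    """Flip each physical bit independently with probability flip_prob."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit despite one flipped bit."""
    return 1 if sum(bits) >= 2 else 0

received = noisy_channel(encode(1))
print(received, "->", decode(received))
```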

Scope in the industry

In spite of the daunting adversities that lie ahead in physically realising a quantum computer, once created, it can achieve what is now touted as "quantum supremacy". Probably the most interesting application thought of till now is in the field of cryptography.

Several modern public-key cryptography systems, which secure web pages, encrypted mail and even online currency, depend on a simple fact: given two sufficiently large prime integers (for example, 300 bits each), we can easily multiply them to create a number with those two primes as its factors, yet even the fastest classical factorisation algorithm will take a very long time to recover the factors and break the security codes.
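The asymmetry is easy to demonstrate even at toy scale: multiplying two primes is essentially free, while recovering them by brute force costs time that grows rapidly with their size. The Python sketch below uses small illustrative primes, nothing like the 300-bit primes of real systems:

```python
import time

# Small illustrative primes; real systems use primes hundreds of bits long.
p, q = 1_000_003, 1_000_033
N = p * q  # encryption-side work: one multiplication, effectively free

def trial_division(n):
    """Brute-force factoring; the attacker pays for every candidate."""
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    return n, 1

start = time.perf_counter()
factors = trial_division(N)
print(factors, f"found in {time.perf_counter() - start:.3f} s")
```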

But if large quantum computers were to exist, one could deploy Shor's algorithm to solve the problem in exponentially less time, leaving data all around the world extremely vulnerable. This would have serious ramifications for cyber security and privacy all over the globe. However, there do exist classical cryptography schemes, though less popular than the ones described above, that are believed to resist attack even by quantum computers.

The quest for quantum computing has one of its roots in Feynman's keynote speech at the 1st Conference on Physics and Computation: a machine for simulating real-world quantum phenomena. It is expected that the simulation of quantum systems, so far limited by the classical model, will be fully realisable on quantum computers. This would help scientists verify earlier hypotheses and unearth new ones too.

The fields of research that would benefit directly are computational chemistry, particle physics and molecular modelling. Inroads into areas like quantum chemistry and biochemistry might mean the creation of newer, more powerful drugs to combat diseases. Materials-simulation applications are already being explored, with some of the newest advances coming from laboratories such as Los Alamos National Laboratory, where scientists have used graph-based methods for quantum molecular dynamics simulations; the results they obtained using D-Wave's quantum systems reportedly "equal or out-perform current state of the art methods."

A particular field of computer science that stands to benefit greatly from the advent of quantum computers is artificial intelligence. Any application of machine learning requires "training" the code on a large data set of sample inputs, so that it can "learn" the correct output. Quantum computation might be able to speed up this process.

Theoretically, it is possible for a quantum computer to unearth patterns in a data set, even an unsorted one, because it can query all the data entries in superposition at the same time. The D-Wave website claims that researchers from 1QBit have developed a method to perform reinforcement learning using the D-Wave 2000Q system.
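The canonical, well-established example of such a speed-up for unstructured search is Grover's algorithm (a circuit-model result, not what the D-Wave annealer runs): finding a marked item among N unsorted entries takes about N/2 queries on average classically, but only on the order of (π/4)·√N quantum queries. A quick arithmetic comparison, counting queries only and ignoring all practical overheads:

```python
import math

for n_bits in (20, 30, 40):
    N = 2 ** n_bits
    classical = N / 2                      # expected queries in a linear scan
    grover = (math.pi / 4) * math.sqrt(N)  # Grover's quantum query count
    print(f"N = 2^{n_bits}: classical ~{classical:.1e}, Grover ~{grover:.1e}")
```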

Current exploits

All the uses listed above are at least a few decades away from growing out of the laboratories and taking their place in the market. For example, to reliably crack a 2048-bit RSA-encrypted code (currently the most popular mode of encryption), we would need a quantum computer with on the order of a billion qubits, while current state-of-the-art quantum machines have only about 512 qubits. But even now, quantum computers can be used for some elementary operations.

For example, D-Wave’s current quantum processor has 512 niobium qubits, which can solve, among other things, optimisation problems of less than or equal to 512 variables.  An optimisation problem essentially is finding the most optimal solution to a linear equation of multiple variables times their individual weights. A simple estimate shows that a 270 variable optimisation problem consisting simply of two options per variable:  ON or OFF, we have 2270 possible combinations, more than the number of atoms in the universe!

Such optimisation problems exist in many different domains, such as systems design, mission planning, airline scheduling, financial analysis, web search, and cancer radiotherapy.

Current times in the quantum computing industry are reminiscent of the 1940s and '50s, when first-generation vacuum-tube computers had just entered the industry: nascent, but with a lot of promise.

Sixty years later, scientists feel the need to step up the computation game to handle amounts of data that were unimaginable to the computer scientists of the 1940s. In order to do so, the need of the hour is to adopt quantum computation as a new paradigm of thought, both in academia and in industry.