Quantum Computing: What is it and is it the next big thing?

Part I of Quantum Computing (Series)

“Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical.” - Richard Feynman

Richard Feynman

What is physics? Physics is one of the most fundamental studies of nature; it strives for a complete understanding of the universe around us. It “describes” everything around us by means of experiment and analysis. What is the universe? The universe is everything we observe around us (matter and energy) and everything that we don’t observe around us (dark matter and dark energy).

So, what is quantum physics? Physics always has a ruling paradigm, and most experiments during an era are governed by it. Before the quantum paradigm, the Newtonian paradigm ruled. By the late 1800s and early 1900s, however, scientists began to observe discrepancies in their experiments that Newtonian physics could not explain. So, instead of a purely analytical approach, we devised a statistical approach to understanding nature. Suddenly we began to realize that nature isn’t continuous but quantized, and that light and matter exhibit both particle and wave behavior (wave-particle duality). It turns out we cannot keep dividing nature into smaller and smaller pieces: we eventually run into a wall of smallest possible energy levels, a well-defined, extremely minuscule quantum.

Why do we believe in it? The experiments done ever since have been among the most successful in the history of science. We have achieved agreement between theory and measurement to better than one part in a hundred million (quantum electrodynamics). When extremely complex natural phenomena are predicted with such precision, we have a responsibility to treat the theory as valid.

Max Planck

The Magic of Quantum Computing

Let’s picture a classical computer that can process 10 bits of data. You can send in any combination of 0s and 1s, as long as the length of that sequence is 10 units (bits). Per processing cycle, one can only send one particular sequence, so our approach is limited to being sequential. One can already imagine how larger data sequences take much longer for a CPU to process. This is why we have historically tried to make CPUs larger (more complex) and faster. However, we are reaching the physical limits behind Moore’s Law. This is why we need a paradigm shift, and this is how we have ended up with quantum computing.
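To make the sequential bottleneck concrete, here is a minimal Python sketch of an exhaustive search over a 10-bit register. The function name and setup are illustrative, not from any particular library: the point is simply that a classical machine checks candidate bitstrings one per cycle.

```python
# A 10-bit register has 2**10 = 1024 possible values, and a classical
# machine must test candidate inputs one at a time, sequentially.
def brute_force_search(target: int, n_bits: int = 10) -> int:
    """Return how many sequential checks are needed to find `target`."""
    checks = 0
    for candidate in range(2 ** n_bits):
        checks += 1
        if candidate == target:
            return checks
    raise ValueError("target out of range")

# In the worst case, the answer is the very last bitstring we try:
print(brute_force_search(target=2 ** 10 - 1))  # 1024 checks
```

Doubling the register to 20 bits multiplies the worst case by 1024; this exponential blow-up is exactly what quantum approaches hope to sidestep for certain problems.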

Superposition

A bit can only hold a 0 or a 1, never a combination of both. Imagine it as a switch that can be turned on or off but never balanced in between the two states. Quantum bits, or qubits, can do exactly that. Using a simple operation (the Hadamard gate) we can produce a state that is both 0 and 1, 50% each. How cool is that? We now have a switch that can be both on and off at the same time.

Applying Hadamard gate to a qubit (Wikipedia)

In the picture above, a qubit with a spin-up state is |0> and a qubit with a spin-down state is |1>. These are the two computational basis states of quantum computing. Notice how the Hadamard operation on |0> creates a state with an equal superposition of both basis states, written |+>. (The Bell states, by contrast, are two-qubit entangled states; we will meet them in the next section.) Because the Hadamard gate is its own inverse, re-applying it to |+> produces the original |0>.
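The Hadamard operation is just a 2x2 matrix, so the picture above can be reproduced in a few lines of NumPy. The variable names (`ket0`, `plus`) are illustrative:

```python
import numpy as np

# Computational basis states |0> and |1> as column vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# The Hadamard gate maps |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

plus = H @ ket0             # the |+> state
probs = np.abs(plus) ** 2   # Born rule: probability of each outcome
print(probs)                # [0.5 0.5] -> 50/50 chance of reading 0 or 1

# H is its own inverse: applying it twice returns the original state.
print(H @ plus)             # back to (approximately) [1. 0.] == |0>
```

Squaring the amplitudes gives the measurement statistics, which is where the “50% each” above comes from.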

Entanglement

This is one of the strangest phenomena in nature. Quantum entanglement occurs when a pair or group of particles is produced whose states are correlated, i.e., they behave deterministically with respect to each other. We can entangle two qubits into a state like (|00> + |11>)/√2, where the measurement of one qubit determines the measurement of the other. We can entangle more than two qubits as well; however, there are physical limitations to doing so, and at the moment we can only entangle on the order of 20 qubits. Entanglement together with superposition allows us to do superdense coding: sending two classical bits of information by transmitting only one qubit, provided the sender and receiver already share an entangled pair. Because the receiver’s half of the pair behaves deterministically with respect to the transmitted qubit, both bits can be recovered from a single joint measurement.
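Continuing the NumPy sketch from before, the entangled state (|00> + |11>)/√2 can be built with a Hadamard followed by a CNOT gate; the variable names are again illustrative:

```python
import numpy as np

# Single-qubit Hadamard and identity.
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)
I = np.eye(2)

# CNOT: flips the second qubit when the first qubit is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

ket00 = np.array([1.0, 0.0, 0.0, 0.0])   # two qubits, both |0>
bell = CNOT @ np.kron(H, I) @ ket00      # (|00> + |11>)/sqrt(2)

# Only the outcomes 00 and 11 have nonzero probability: measuring one
# qubit fully determines what the other will read.
probs = np.abs(bell) ** 2
print(probs)  # [0.5 0.  0.  0.5]
```

Note that the outcomes 01 and 10 have probability zero, which is exactly the “deterministic with respect to each other” behavior described above.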

Noise and NISQ

When we read about quantum algorithms that run on state-of-the-art quantum computers, we mostly talk about fault-tolerant quantum computers with millions of qubits, which do not exist. Individual qubits are noisy and corrupted by the environment around them, which is why they need to be cooled to the order of ~10 mK so that heat does not become an overwhelming source of noise. Ideally, we would have enough qubits to correct these errors, but we do not. Instead, we have Noisy Intermediate-Scale Quantum (NISQ) computers: noisy because we don’t have enough qubits to spare for error correction, so we settle for noisy results, and intermediate-scale because we expect many more qubits in the future. Think of the time when we had 8- or 16-bit processors in our computers.

Rigetti is a California-based quantum computing company that currently offers a 19-qubit QPU.

Types of Quantum Computers

This infographic from Carl De Torres for IBM Research explains the idea behind different types of quantum computers very well:

Quantum Annealers: These quantum computers solve a problem by finding the global minimum of a given objective function, similar to simulated annealing. D-Wave Systems has created a 2048-qubit quantum annealer, currently one of the largest systems available. Its qubits are not as well protected against errors as those in other quantum computing models, but they are designed to do one thing very well: optimization. On suitable problems they can converge to a global minimum much faster than their classical counterparts. Scientists have used D-Wave systems for satellite optimization, cancer research, traffic flow optimization, unsupervised learning, and more.
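To see what “finding the global minimum of an objective function” means, here is a minimal sketch of classical simulated annealing on a toy 1-D objective. This is the classical analogue only: a quantum annealer attacks the same kind of problem, but can tunnel through energy barriers rather than hopping over them thermally. The objective and parameters are invented for illustration:

```python
import math
import random

random.seed(1)  # reproducible demo

def objective(x: float) -> float:
    """A toy objective with a local minimum near x = 3.8 and the
    global minimum near x = -1.3."""
    return x * x + 10 * math.sin(x)

def simulated_annealing(steps: int = 20_000) -> float:
    x = 10.0                  # deliberately poor starting point
    best = x
    temp = 10.0
    for _ in range(steps):
        candidate = x + random.uniform(-0.5, 0.5)
        delta = objective(candidate) - objective(x)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability so the walk can escape local minima.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = candidate
            if objective(x) < objective(best):
                best = x
        temp *= 0.9995        # geometric cooling schedule
    return best

x_min = simulated_annealing()
print(x_min)  # usually lands near the global minimum around x = -1.3
```

The temperature schedule is the crux: too fast and the walk freezes in a local minimum, too slow and it wastes cycles. An annealer's analogous knob is how slowly the quantum Hamiltonian is interpolated.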

Analog Quantum Computing: When we talk about NISQ, we are usually talking about this approach, which today is typically programmed through the gate model. Here, we have multiple qubits in a system, and we can manipulate and entangle each of them by means of quantum logic gates, analogous to classical logic gates. These computers are closer to universal quantum computers than quantum annealers are. Their advantage doesn’t lie in doing certain things faster, but in doing certain things a classical computer practically cannot, such as quantum simulations. Rigetti and IBM currently have such systems available in the North American market.

As an aside, you can experiment with an IBM 5-qubit system that is available to the public.

Universal Quantum Computers and Quantum Supremacy: This is the distant goal we dream of and strive toward. Such systems can do anything the above two systems can, and much more. These devices are fault-tolerant (at least to a significant extent) and, by design, can perform certain operations that classical computers can’t. They could simulate nature as it is, opening doors to new paradigms of physics unimagined by our Newtonian minds. The ability of quantum computing devices to solve problems that classical computers practically cannot is known as Quantum Supremacy. However, some research has shown that such supremacy could be achieved even on the adiabatic quantum computers (annealers) mentioned above, provided they have enough qubits. Google claims to be the closest to doing so.

Is it the next big thing then?

Yes, it most definitely is. We have made a great deal of progress so far and achieved some amazing results that show quite a lot of potential. We have already shown how quantum computers are capable of tackling certain problems that classical computers can’t, whether complex chemical simulations or quantum-based cryptography. It is only a matter of time before we have millions of qubits at our disposal; it could be anywhere from 10 to 20 years. Until then, we have NISQ-era quantum computers to work with, and they are good enough to formulate a long-term approach.

“There’s plenty of room at the bottom.” — Feynman (This simple statement was the title of a talk Feynman delivered in 1959, widely regarded today as the original inspiration for nanoscience and nanotechnology. Feynman recognized that miniaturization in recording information was limited only by the size of atoms, and he even imagined that atoms could someday be manipulated individually. And they have been.)

Interested in learning more about quantum computing? I will be publishing a sequel to this article, exploring the concepts in more detail. Meanwhile, I highly recommend checking out the following:

  1. Quantum Computing for the Determined, by Michael Nielsen (Video Series)
  2. A Quantum Machine Learning course on edX.org by Peter Wittek (Link).