The author is ScIU guest writer AJ Rasmusson, a graduate student in IU’s Department of Physics.
Tech companies are going big in a microscopic way, pouring millions of dollars into a new form of computing: quantum computing. Quantum computers will revolutionize drug research, material discovery, and artificial intelligence by solving complex problems in a new way. To understand this, let’s review how normal computers solve problems and compare this to how a quantum computer would do it.
Today’s computers use billions upon billions of 0’s and 1’s, called bits, to represent information: numbers, words, images, and so on. To watch a movie or simulate life-saving chemical reactions in medicine, a processor in the computer takes a group of bits and modifies them according to programmed instructions. For example, to watch Infinity War, your computer processes more than 16 billion bits. By repeating this process very quickly, your computer can turn a file of 0’s and 1’s into moving images on your display, or store the answers to complex math equations in a new file.
Quantum computers take a different approach to information processing. Instead of solving a problem one outcome at a time, a quantum computer computes every possible outcome simultaneously. To understand the power of that statement (and what it even means) let’s consider an example.
Imagine you want to find the deepest place on Earth, and that a single point on the planet can be uniquely identified by a group of bits, like GPS coordinates. To move the point toward lower ground, the processor modifies the group of bits so they represent a neighboring point that’s further downhill. This is repeated until the point reaches a minimum. But is this the lowest minimum? To truly know, the computer must search the whole planet: it must start at every point, go downhill, and then find the lowest of all the discovered minimums. That’s a lot of starting points! If each point were enlarged to a square mile, the computer would need to try 196.9 million points!
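The classical search above — start everywhere, walk downhill, keep the deepest result — can be sketched in a few lines of Python. This is a hypothetical toy, with a one-dimensional list of depths standing in for the planet and left/right neighbors standing in for adjacent points:

```python
import random

# Toy "depth map": N points, each with a depth; neighbors are i-1 and i+1.
# (A hypothetical stand-in for the planet's 196.9 million square-mile grid.)
random.seed(0)
N = 100
depth = [random.uniform(-11000, 9000) for _ in range(N)]

def descend(start):
    """Walk downhill from `start` until no neighbor is deeper."""
    i = start
    while True:
        neighbors = [j for j in (i - 1, i + 1) if 0 <= j < N]
        best = min(neighbors, key=lambda j: depth[j])
        if depth[best] >= depth[i]:
            return i  # local minimum reached
        i = best

# A classical computer must try every starting point, one at a time.
local_minima = {descend(s) for s in range(N)}
deepest = min(local_minima, key=lambda i: depth[i])
print(deepest, depth[deepest])
```

Note that the loop over starting points is the expensive part: the classical machine repeats the same downhill walk once per point, which is exactly what quantum parallelism promises to avoid.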
A quantum computer would solve this problem 196.9 million times faster. Quantum computers have a peculiar kind of bit, called a qubit, that can represent all possible points simultaneously. The quantum computer processes these qubits so that all the points go downhill simultaneously, and thus it finds all the low points in one computation! We can then have the quantum computer compare all the low points and output the lowest one. If each point is a square mile, the quantum computer found the Challenger Deep (Mariana Trench) in one attempt—196.9 million times faster than the normal computer! This advantage is called quantum parallelism because the quantum computer solves for each point in parallel with (meaning at the same time as) every other point.
So, what are these qubits, and why do they let a quantum computer solve this problem in parallel instead of one point at a time?
Qubits are quantum bits. They are made from tiny particles, or groups of particles, where quantum mechanics dominates: think electrons, atoms, and photons. A particular property of these quantum particles (e.g., energy or magnetic moment) is mapped to a 0 or 1—just like normal bits. But, qubits have a trick up their sleeve. They can also be 0 and 1 at the same time! This is called superposition.
Superposition is analogous to a spinning coin. While a coin is spinning, is it heads (0) or tails (1)? Well, we know that “measuring” the coin will give heads (0) 50% of the time and tails (1) the other 50%. Since the spinning coin can give either a heads (0) or tails (1) result, the spinning coin is, philosophically speaking, both heads and tails at the same time! Analogously, a qubit in superposition is both 0 and 1, but when measured, the qubit will read either 0 or 1. As a side note, rigorous experimentation has shown that a qubit in superposition truly is in both states at once—this has fascinating philosophical implications about reality that I will not cover here.
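The spinning-coin analogy is easy to check numerically. Here is a tiny sketch (plain Python, a hypothetical toy rather than real quantum software) that “measures” an equal superposition many times and tallies the outcomes:

```python
import random

random.seed(1)

# Simulate measuring a qubit in an equal superposition 10,000 times:
# each measurement "collapses" to 0 or 1, each with 50% probability.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    outcome = 0 if random.random() < 0.5 else 1
    counts[outcome] += 1

print(counts)  # roughly 5,000 of each, like a long run of coin flips
```

Just as with the coin, no single measurement reveals the superposition; only the statistics over many measurements do.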
Let’s review why superposition allowed the quantum computer to solve the lowest-location problem in parallel. First, let’s say we can uniquely define a location on the planet with just three bits (far too few in reality, but that’s beside the point). That means the normal computer had to process 8 different combinations of 0 and 1 (000, 001, 010, 011, etc.). For the quantum computer, say we start with 000. Next, we modify the qubits so that each qubit is in a superposition of 0 and 1. If we were to measure the qubits at this point, we’d find 000 one eighth of the time, 001 one eighth of the time, 010 one eighth of the time, and so on—all the way to 111. Since all eight possible points are possible outcomes of a measurement, the qubits simultaneously represent all eight points! Now that all eight points are represented simultaneously, we modify the qubits further so they travel downhill. Once they all hit a bottom, we measure the qubits and they (hopefully) output the lowest value!
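The one-eighth arithmetic above follows directly from how amplitudes combine. In this sketch (plain Python, a hypothetical toy), each qubit carries amplitude 1/√2 on 0 and on 1; for three independent qubits, the amplitude of each three-bit string is the product of the three, and the measurement probability is that amplitude squared:

```python
import itertools

# One qubit in an equal superposition: amplitude 1/sqrt(2) on 0 and on 1.
amp = 1 / 2 ** 0.5

# Three such qubits: the joint amplitude of a bit string is the product
# of the three per-qubit amplitudes, and probability = amplitude squared.
probs = {}
for bits in itertools.product("01", repeat=3):
    amplitude = amp * amp * amp   # same magnitude for every combination
    probs["".join(bits)] = amplitude ** 2

print(probs)  # all eight strings 000..111, each with probability 1/8
```

The eight probabilities sum to 1, as they must: the superposition covers every point exactly once.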
There are a couple of important side notes to make about the quantum algorithm we just walked through. First, the qubits cannot be disturbed whatsoever during the processing. If they are unintentionally bumped—say, by nearby particles—they will “collapse” from representing all points to just one point and thus ruin the parallelism. Second, the algorithm must be very clever. Near the end, the qubits simultaneously represented all the low points, so when we measure the qubits, there is a probability of getting a low point that is not the lowest point. How do we increase the probability of measuring only the lowest point? These concerns and questions are at the bleeding edge of quantum information science. In fact, IU is starting a “Center for Quantum Information Science and Engineering” to help answer some of these and other related questions. As quantum computers and the quantum information science field mature, more algorithms and strategies will emerge that leverage quantum parallelism to solve important problems faster than any supercomputer ever could.
In this article, we covered how superposition gives quantum computers a serious advantage over normal computers. There is a second advantage called entanglement—stay tuned for more on that.