An Introduction to Quantum Computing

Examining the technology revolutionizing computation.

Alex K
16 min read · Apr 12, 2021

Heard of quantum computing?

You might have seen the phrase thrown around in science fiction movies. Maybe headlines about the latest breakthrough in the field, perhaps laced with baffling buzz words like “superposition” and “entanglement”, have caught your attention. But beneath the layers of hype and noise, how do quantum computers actually work, and what implications do they have for the future of computation?

The Qubit

Today’s “classical” computers encode information in bits — binary digits representing a logical state with just 1 of 2 possible values: 0 or 1. Everything we use computers for today, from Twitter and Tmall to Zoom and Amazon, is ultimately composed of streams of electrical signals representing incredibly long strings of 1s and 0s.

The basic computational unit in quantum computing, by contrast, is the qubit (a portmanteau of “quantum bit”). Qubits are fundamentally different from conventional bits, insofar as they are not confined to being in a state of either 0 or 1 — qubits can take on any weighted combination of the two. Crucially, quantum systems can exist in several of these quantum states at the same time — a simplistic depiction of this phenomenon, dubbed superposition, is provided below:
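For example, a qubit can be prepared in an equal superposition of the two basis values, written in the notation introduced below as:

    |\psi\rangle = \tfrac{1}{\sqrt{2}}\,|0\rangle + \tfrac{1}{\sqrt{2}}\,|1\rangle

When such a qubit is measured, it yields 0 or 1 with equal probability.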

Bra–ket Notation

In order to understand why qubits can be visualized in this way, we must develop a mathematical model for the intrinsic angular momentum carried by elementary particles, called “spin” — that is to say, certain particles act as though they are spinning, even though nothing is physically rotating. This mysterious behavior arises from special relativistic effects within quantum mechanics.

“Imagine a ball that is spinning, except it is not a ball and it is not spinning.”

Any spin state of an electron can be written as a linear combination of spin-up and spin-down. We represent these two orthonormal basis states with |0⟩ and |1⟩ respectively. These “kets” correspond to the following basis vectors:
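In the standard computational-basis convention, these two kets are the column vectors:

    |0\rangle = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \qquad |1\rangle = \begin{pmatrix} 0 \\ 1 \end{pmatrix}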

Quantum Indeterminacy

Any quantum bit wave function can be expressed as a linear combination of these two basis states, each with its own complex coefficient. This is expressed below in bra-ket notation, with α and β as the coefficients (often referred to as “amplitudes”) of the two states:
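In symbols, the general single-qubit state reads:

    |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad \alpha, \beta \in \mathbb{C}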

It is important to grasp that unobserved quantum systems exist as probabilities rather than actualities, giving particles the character of things that might be; this fundamental condition of existence stands in stark contrast to Newtonian physics, in which things either are, or are not. Accordingly, an unmeasured quantum system does not possess fixed values of properties such as position and spin until those properties are measured. Strictly speaking, a quantum particle has no definite state until it is observed.

Although qubits collapse into one of two deterministic states upon measurement, the probability of falling into state 0 or state 1 can be modeled mathematically. Indeed, the probability of each outcome is given by the square of the magnitude of the corresponding coefficient. The non-negative reals |α|² and |β|² thereby correspond to the probabilities of measuring the system in state |0⟩ and |1⟩ respectively. Moreover, since these two outcomes are the only possibilities, their probabilities must sum to one, giving the following relationship:
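That is, the amplitudes obey the normalization condition:

    |\alpha|^2 + |\beta|^2 = 1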

The Bloch Sphere

Any complex number can be represented in both rectangular and polar form:
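In standard notation, writing r for the modulus and φ for the argument:

    z = x + iy = r(\cos\varphi + i\sin\varphi) = r\,e^{i\varphi}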

α can therefore be written as r₀e^(iφ₀), and β, also a complex number, can similarly be expressed in polar form as r₁e^(iφ₁). These polar representations can then be substituted into the aforementioned equation describing the general qubit state, as follows:
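Substituting the polar forms gives:

    |\psi\rangle = r_0 e^{i\varphi_0}\,|0\rangle + r_1 e^{i\varphi_1}\,|1\rangle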

A factor of e^(iφ₀) can subsequently be factored out. Since this global phase is said to be “physically irrelevant” (it has no effect on any measurement outcome), it can be dropped from consideration. Moreover, the relative phase φ₁ − φ₀ can be rewritten simply as φ. The steps corresponding to these 3 transformations are depicted here:
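Written out, the three steps read:

    |\psi\rangle = e^{i\varphi_0}\!\left(r_0\,|0\rangle + r_1 e^{i(\varphi_1 - \varphi_0)}\,|1\rangle\right) \;\rightarrow\; r_0\,|0\rangle + r_1 e^{i(\varphi_1 - \varphi_0)}\,|1\rangle \;\rightarrow\; r_0\,|0\rangle + r_1 e^{i\varphi}\,|1\rangle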

Finally, we can leverage the normalization constraint outlined above, which forces r₀² + r₁² = 1, and express r₀ and r₁ as the cosine and sine of θ/2 respectively, as shown below:
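That is:

    r_0 = \cos\tfrac{\theta}{2}, \qquad r_1 = \sin\tfrac{\theta}{2}, \qquad 0 \le \theta \le \pi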

These terms can then be substituted back into our initial expression for a general quantum state, yielding the following representation:
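This is the familiar Bloch-sphere form of a single-qubit state:

    |\psi\rangle = \cos\tfrac{\theta}{2}\,|0\rangle + e^{i\varphi}\sin\tfrac{\theta}{2}\,|1\rangle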

The system has therefore been described with just 2 real parameters: θ and φ. Qubit states can thus be modeled as unit vectors in three-dimensional space, with θ and φ as the polar and azimuthal angles. Accordingly, the state of a quantum bit can be geometrically represented as a point sitting on the surface of a unit sphere in 3D space — this illustration is known as the Bloch sphere:
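As a quick numerical sanity check, a minimal NumPy sketch (not essential to the argument) can map a pair of amplitudes to the corresponding point on the sphere, using the expectation values of the Pauli operators:

    import numpy as np

    def bloch_coordinates(alpha, beta):
        """Map a normalized qubit state alpha|0> + beta|1> to its point (x, y, z) on the Bloch sphere."""
        x = 2 * (np.conj(alpha) * beta).real   # <sigma_x> = sin(theta) * cos(phi)
        y = 2 * (np.conj(alpha) * beta).imag   # <sigma_y> = sin(theta) * sin(phi)
        z = abs(alpha) ** 2 - abs(beta) ** 2   # <sigma_z> = cos(theta)
        return x, y, z

    # The equal superposition (|0> + |1>)/sqrt(2) lands on the equator of the sphere, at (1, 0, 0).
    print(bloch_coordinates(1 / np.sqrt(2), 1 / np.sqrt(2)))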

Multi-Qubit Systems

In the same way that classical computers would not be able to achieve much with just a single bit, the functionality of full-fledged quantum computers depends on having processors with multiple qubits. Indeed, since 2 classical bits can produce 4 different states (00, 01, 10, and 11), 4 complex coefficients are required to describe the state of a two-qubit system; these amplitudes are stored in a four-dimensional vector. Furthermore, the squared magnitudes thereof sum to 1 (per the aforementioned normalization condition), as seen here:
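Explicitly, writing the four amplitudes as c₀₀ through c₁₁:

    |\psi\rangle = c_{00}\,|00\rangle + c_{01}\,|01\rangle + c_{10}\,|10\rangle + c_{11}\,|11\rangle, \qquad |c_{00}|^2 + |c_{01}|^2 + |c_{10}|^2 + |c_{11}|^2 = 1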

Additionally, the collective state of two independent (unentangled) qubits is given by the tensor product of their state vectors, which is defined as follows:
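For two single-qubit states, the tensor product works out to:

    (\alpha_0\,|0\rangle + \alpha_1\,|1\rangle) \otimes (\beta_0\,|0\rangle + \beta_1\,|1\rangle) = \alpha_0\beta_0\,|00\rangle + \alpha_0\beta_1\,|01\rangle + \alpha_1\beta_0\,|10\rangle + \alpha_1\beta_1\,|11\rangle

or, in vector form, (a, b)ᵀ ⊗ (c, d)ᵀ = (ac, ad, bc, bd)ᵀ.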

This rule can be used to describe the collective state of any number of qubits. Crucially, describing a system of n qubits requires keeping track of 2ⁿ quantum state amplitudes — these vectors grow exponentially with the number of qubits, rendering physical memory the limiting factor for classical simulations of quantum machines.
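A rough back-of-the-envelope sketch, assuming 16 bytes per complex amplitude, makes the scaling concrete:

    # Memory needed to store all 2**n amplitudes of an n-qubit state on a classical machine,
    # assuming one complex128 value (16 bytes) per amplitude.
    for n in (20, 40, 60):
        n_amplitudes = 2 ** n
        print(f"{n} qubits: {n_amplitudes:,} amplitudes, {n_amplitudes * 16:,} bytes")
    # Roughly 17 MB for 20 qubits, 18 TB for 40 qubits, and 18 EB for 60 qubits.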

Quantum Entanglement

In a textbook case of “all cows are cattle, but not all cattle are cows”, it should be noted that although the tensor product of two single-qubit states always forms a valid two-qubit state, not all two-qubit states can be written as the tensor product of two single-qubit states. The following oft-mentioned example illustrates this observation:
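One canonical example is the Bell state:

    |\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right)

If this state could be factored as (α₀|0⟩ + α₁|1⟩) ⊗ (β₀|0⟩ + β₁|1⟩), the vanishing |01⟩ and |10⟩ amplitudes would force α₀β₁ = α₁β₀ = 0, which is impossible given that the |00⟩ and |11⟩ amplitudes α₀β₀ and α₁β₁ are both non-zero; no such factorization exists.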

Two qubits which cannot be written as the tensor product of two single-qubit states are said to be “entangled”. By virtue of this very condition, the information held by the quantum state is not confined to either qubit individually. Rather, the two qubits are correlated in such a way that the information is stored non-locally (applying not only to space, but also to time) in the correlations between them — the quantum states must therefore be described with reference to each other.

Measurements of a qubit’s spin state influence an arbitrarily distant particle with which the first is entangled.

This quantum mechanical phenomenon leads to correlations between observable physical properties of the systems — that is, measuring one of the qubits instantaneously determines the corresponding outcome for the other, in a predictable fashion. Quantum computers harness these entangled qubits in a kind of quantum daisy chain, and each additional qubit doubles the number of amplitudes the machine can represent, producing exponential growth in its computational state space. Despite the fact that entanglement has continued to mystify the scientific community, most researchers in the field of quantum mechanics today argue that the exponential growth associated with this phenomenon is necessary to realize the promise of quantum computing systems.

Quantum entanglement is “spooky action at a distance” − Albert Einstein

Quantum Logic Gates

An IBM quantum circuit illustrating the non-local implementation of a CNOT gate.

Just as classical computers depend upon classical logic gates to manipulate 1s and 0s in a way that enables the execution of complex operations, quantum circuits (the most common model for quantum computation) are built from quantum logic gates. The effect of a specific quantum gate on a particular quantum state can be ascertained by multiplying the vector representing the state by the matrix U corresponding to the logic gate. This simple, generalized procedure is depicted here:
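In symbols, |ψ′⟩ = U|ψ⟩. A minimal NumPy sketch, with the Hadamard gate chosen as an arbitrary example of U:

    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)      # the state |0>
    H = np.array([[1, 1],
                  [1, -1]]) / np.sqrt(2)        # the Hadamard gate, a common single-qubit U

    new_state = H @ ket0                        # applying a gate is just matrix-vector multiplication
    print(new_state)                            # [0.707..., 0.707...], i.e. (|0> + |1>)/sqrt(2)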

Unlike many classical logic gates, quantum logic gates are always reversible — that is, each output corresponds to a unique input, which ensures that information is not erased while the logic gate is acting. Moreover, having run a computation forward to produce an outcome, once that outcome is copied, the entire computation can be undone — in principle enabling the recovery of the energy expended in the computational process.

The mathematical basis underlying this property is that an n-qubit reversible quantum logic gate can be described as follows:
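In matrix terms, such a gate is a 2ⁿ × 2ⁿ unitary matrix U, meaning it satisfies:

    U^{\dagger} U = U U^{\dagger} = I

so that every gate can be undone by applying its conjugate transpose U†.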

Quantum Decoherence

Quantum systems, particularly those of great size and complexity, are incredibly sensitive to environmental noise. This noise degrades and ultimately destroys the quantum behavior of particles, just as friction causes the loss of energy in classical mechanics. The phenomenon is described in quantum mechanics as a loss of coherence — that is, the definite phase relation between different states, which is necessary for computing with the quantum information encoded in those states, no longer exists.

On the flip side of the coin, a perfectly isolated quantum system would indeed maintain coherence indefinitely, thereby protecting quantum information from decoherence-induced errors. However, it would be impossible to investigate or manipulate such a system without violating the very assumption of perfect isolation.

“The greatest hurdle in using quantum objects for computing is to preserve their delicate superpositions long enough to allow us to perform useful calculations” − Andrea Morello

Quantum Error Correction

In order to achieve fault-tolerant quantum computation, quantum error correction (QEC) methods, which arise from the intersection of quantum mechanics and the classical theory of error-correcting codes, are leveraged.

The simplest quantum error-correcting code is illustrated above. In this example, Alice (the source) wishes to transmit a single-qubit state |ϕ⟩ = α|0⟩ + β|1⟩ via a noisy communication channel which randomly introduces σₓ errors (bit flips, |0⟩ ↔ |1⟩). As seen above, Alice prepares two further qubits, which are represented by small yellow circles. Alice then applies two controlled-NOT (CNOT) gates in order to encode her initial qubit into a joint state with the two additional qubits. Finally, Alice sends all three qubits, now in the joint state α|000⟩ + β|111⟩, down the noisy channel.
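Writing the two extra qubits as |00⟩, the two CNOT gates (each controlled on Alice’s original qubit) carry out the encoding step by step:

    (\alpha\,|0\rangle + \beta\,|1\rangle)\,|00\rangle = \alpha\,|000\rangle + \beta\,|100\rangle \;\xrightarrow{\text{CNOT}_{1\rightarrow 2}}\; \alpha\,|000\rangle + \beta\,|110\rangle \;\xrightarrow{\text{CNOT}_{1\rightarrow 3}}\; \alpha\,|000\rangle + \beta\,|111\rangle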

Bob (the receiver) recovers and corrects the joint state by extracting and employing an error syndrome — two classical bits of information which help him diagnose and rectify the errors in the three received qubits; indeed, these corrections consist of a σₓ operation applied to either one or none of the three qubits. Bob concludes the QEC process by disentangling one qubit from the other two, producing the single-qubit state |ϕ⟩ with probability of success 1 − O(p²).

As summarized in the table above, the probability that Bob fails to obtain Alice’s original single-qubit state is O(p²), whereas it would have been O(p) had QEC not been used. Accordingly, with just three times as many qubits, and p = 0.001, the error probability is reduced by an impressive factor of ~300. However, with 300 times as many qubits instead, and the same p = 0.001 assumption, an error probability reduction by a factor of just ~3 is observed — this indicates that more sophisticated error-correcting codes are required for large-scale quantum computers.
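The first of these figures is easy to check with a short sketch of the arithmetic, treating qubit errors as independent so that decoding fails only when two or three of the qubits are flipped:

    def repetition_code_failure(p):
        """Probability that the 3-qubit bit-flip code decodes incorrectly,
        i.e. that 2 or 3 of the qubits suffer independent sigma_x errors."""
        return 3 * p**2 * (1 - p) + p**3

    p = 0.001
    print(f"unencoded error probability: {p}")
    print(f"encoded error probability:   {repetition_code_failure(p):.3e}")     # ~3.0e-06
    print(f"improvement factor:          {p / repetition_code_failure(p):.0f}") # ~334, i.e. roughly 300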

“It must be recognized that large systems are not simply larger versions of small systems” − Rodney Van Meter

Architecture

The architecture of a quantum computer combines classical and quantum components. As showcased in an open-access chapter by Surya Teja Marella and Hemanth Sai Kumar Parisa, quantum architecture can be divided into 5 layers, as seen here:

Each of the 5 layers depicted above can be seen as a functional part of the quantum computer:

  • The Application Layer represents the OS, IDE, UI, and other crucial pieces of infrastructure, thereby enabling the formulation of suitable quantum algorithms.
  • The Classical Layer optimizes the quantum algorithm, compiles it into microinstructions, and processes & returns quantum-state measurements.
  • The Digital Layer translates microinstructions into signals, and feeds quantum-state measurements back to the Classical Layer for merging.
  • The Analog Layer creates voltage signals, which are sent to the Quantum Layer, facilitating the execution of quantum operations.
  • The Quantum Layer holds qubits and performs QEC.

Quantum Supremacy

For decades, researchers in the field of quantum computing have sought to demonstrate that programmable quantum devices can crack problems that no classical computer can solve in any feasible amount of time (irrespective of the usefulness of having solved the problem), and thereby achieve “quantum supremacy”.

“I proposed the term ‘quantum supremacy’ to describe the point where quantum computers can do things that classical computers can’t” − John Preskill

In October 2019, engineers from Google made history with a landmark claim that they had finally achieved this long-awaited milestone; their 54-qubit (53 of which were functional) Sycamore processor performed a random sampling calculation in just 200 seconds, 1.58 billion times faster than their projection for how long it would take the fastest classical supercomputer to execute the same computation: 10,000 years.

Google claimed quantum supremacy with its fully programmable, 54-qubit processor called Sycamore.

Although a subsequent blog post by IBM, a competitor, cast a shadow of doubt on the validity of Google’s 10,000-year projection, the feat has undoubtedly sparked a fresh era of excitement around quantum computing, alongside explosive interest in the anticipated commercial implications of this emerging technology.

Industry Applications

A February 2020 McKinsey & Company report identified 4 fundamental capacities distinguishing quantum computers from classical computers: quantum simulation, quantum optimization, quantum AI, and prime factorization.

A November 2020 Volkswagen Group report conceived of the same 4 applications for quantum computers.

Four high-potential, specific applications help to paint a richer picture of the potential for quantum computing to disrupt a variety of industries.

1. Cut development time for new pharmaceuticals with simulations

Examining the molecular structure of “targets” is a critical component of the drug discovery pipeline. However, it is almost impossible for even the most advanced classical computers to simulate relatively basic molecules, owing to the difficulty of modeling their complex atomic interactions. This limitation has motivated a reliance on chemical synthesis: physically creating the molecules in question in order to measure their properties — an expensive process which is riddled with errors, and is consequently time-consuming.

Quantum computers, on the other hand, are said to be “intrinsically well-suited to tackle this problem” — after all, the very interaction of atoms within a molecule, which classical computers struggle to model, is in fact itself a quantum system. Improving the front-end of the drug discovery process with quantum computing has the potential to “dramatically cut costs and time to market” and enable “computational chemists to make new discoveries faster that could lead to cures for a range of diseases”, according to a recent Accenture report.

“In three to five years, I’d expect to see hybrid algorithms for protein design, structure prediction, and quantum value for prediction problems such as QSAR” − Lucas Siow

Indeed, seeking to capitalize on recent advancements in both fields, Boehringer Ingelheim became the first pharma company to partner with Google for quantum computing efforts, in January 2021. Through this partnership, both companies are striving to solve relevant pharma challenges in the beyond-classical regime: simulating the body’s biological mechanics at the molecular level with unprecedented speed, and consequently discovering & designing new life-saving medical treatments.

Biopharmaceutical company Boehringer Ingelheim is looking to leverage quantum computing for pharma R&D.

2. Solve complex optimization problems with unprecedented speed

Across virtually every industry, businesses face a host of complex optimization problems. Consider Amazon, whose last-mile delivery network is tasked with delivering over 10 million customer packages every single day — how should truck drivers optimize the route they take? Further reflect on the fact that JPMorgan Chase & Co. must determine a borrower’s ability to meet their debt obligations — how are financial practitioners to determine the optimal trading trajectory, arbitrage opportunities, and feature selection in credit scoring, if doing so involves processing trillions of different scenarios?

Needless to say, classical computers cannot handle such scenarios effectively; the process is arduous, time-consuming, and necessitates isolating the inputs which one believes to drive performance gains and losses the most — an unavoidable simplification that is nonetheless doomed to fail. Quantum computers, capable of simulating trillions of scenarios simultaneously, can dramatically narrow the range of possible solutions, rendering computation of the “last mile”, to revive the transportation planning example, trivial — even for classical computers.

“Using nature to solve optimization problems is an old idea. In the quantum setting, it is a surprisingly powerful idea” − Ojas Parekh

Quantum annealing (QA) is a metaheuristic tool which naturally returns low-energy solutions. By sampling from many low-energy states, and subsequently characterizing the shape of the energy landscape, QA models are capable of building the very probabilistic models of reality needed to rapidly solve binary optimization problems.

Quantum annealing, adept at solving optimization problems, drives probabilities toward the global minimum.
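For intuition, classical simulated annealing, the thermal analogue of QA, can be sketched in a few lines; the toy QUBO coefficients below are purely illustrative:

    import math, random

    # A toy QUBO (quadratic unconstrained binary optimization) instance:
    # E(x) = sum_i h[i]*x[i] + sum_{i<j} J[i,j]*x[i]*x[j], with each x[i] in {0, 1}.
    h = [1.0, -2.0, 1.5, -1.0]
    J = {(0, 1): -1.0, (1, 2): 2.0, (2, 3): -1.5}

    def energy(x):
        return sum(hi * xi for hi, xi in zip(h, x)) + sum(Jij * x[i] * x[j] for (i, j), Jij in J.items())

    def anneal(steps=5000, T0=2.0):
        x = [random.randint(0, 1) for _ in h]            # start from a random bit string
        for step in range(steps):
            T = T0 * (1 - step / steps) + 1e-3           # slowly lower the "temperature"
            candidate = x.copy()
            candidate[random.randrange(len(x))] ^= 1     # propose flipping one binary variable
            dE = energy(candidate) - energy(x)
            if dE < 0 or random.random() < math.exp(-dE / T):
                x = candidate                            # keep moves that lower the energy (and, occasionally, ones that raise it)
        return x, energy(x)

    print(anneal())   # a low-energy assignment of the four binary variables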

3. Accelerate autonomous vehicles with quantum AI

Firms in the automotive sector are currently running hours of video, image, LIDAR, and IR data through neural networks, with the ambition of leveraging AI to train self-driving vehicles to make safe and reliable driving decisions. However, doing so mandates the execution of a series of computationally intensive calculations — the difficulty and number of which increase exponentially when training AVs to operate in largely unrestricted environments.

Again, since quantum computers are capable of executing many complex calculations with multiple variables simultaneously, they are well placed to exponentially accelerate the training of autonomous AI systems. It must be noted, though, that the use of quantum computing for machine learning faces a major challenge: translating classical datasets into quantum ones — an obstacle yet to be surmounted.

“Quantum computers could have a welcomed computationally proficient slot in the cloud for the aiding of true self-driving cars” − Lance Eliot

Moreover, a September 2020 McKinsey & Company report has further spotlighted the additional value quantum computing could bring to stakeholders across the automotive value chain. From enhancing predictive maintenance software, to optimizing the routing of warehouse robots, quantum computing is set to solve an array of role-specific problems in the industry’s value chain (mostly in the form of hybrid solutions with HPC clusters).

A September 2020 McKinsey & Company report outlined QC applications across the automotive value chain.

4. Transform encryption with quantum cryptography

The vast majority of online-account passwords, secure electronic transactions, and secure communications are protected through encryption algorithms, including Triple DES, RSA, and AES. Some of these methods (most notably RSA) are based on the fact that two large prime numbers can be multiplied with relative ease and speed, while it is extremely computationally intensive to do the reverse. The inability of classical computers to efficiently find the prime factors of a large composite number has ensured the viability of such encryption algorithms to date.

“As it so happens, [quantum computers] can solve the sort of these cryptographic problems upon which we built our cryptography in the 1980s exponentially faster than classical computers” − Vadim Lyubashevsky

However, quantum computers, equipped with the ability to explore many possibilities simultaneously, pose a significant threat to classical encryption systems. Indeed, Shor’s algorithm demonstrates that integers can be factored efficiently on an ideal quantum computer. Although thousands, if not millions, of qubits would be required to execute Shor’s algorithm (a flowchart of which is shown below) at cryptographically relevant scales, a feat not yet achieved, large-scale quantum computers in the not-so-distant future could defeat prevalent encryption techniques — potentially rendering even our most basic online services defenseless.

A flowchart showing an overview of Shor’s algorithm, as reconstructed from Craig Gidney’s Algassert article.
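To make the structure of the flowchart concrete, here is a rough classical sketch of the algorithm, with the quantum order-finding step replaced by brute force (so it is only feasible for toy numbers such as 15):

    import math, random

    def order(a, N):
        """Find the order r of a modulo N, i.e. the smallest r with a**r = 1 (mod N).
        This is the step a quantum computer performs efficiently; here it is brute-forced."""
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def shor_factor(N):
        """Classical skeleton of Shor's algorithm for a small odd composite N."""
        while True:
            a = random.randrange(2, N)
            g = math.gcd(a, N)
            if g > 1:
                return g, N // g                          # lucky guess: a already shares a factor with N
            r = order(a, N)
            if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
                continue                                  # unusable order, try another a
            p = math.gcd(pow(a, r // 2, N) - 1, N)
            q = math.gcd(pow(a, r // 2, N) + 1, N)
            if 1 < p < N:
                return p, N // p
            if 1 < q < N:
                return q, N // q

    print(shor_factor(15))   # e.g. (3, 5)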

Fortunately, quantum-safe encryption technologies could mitigate this risk, by replacing the quantum-vulnerable mathematical problems used in public-key cryptographic algorithms with protocols involving certain components of quantum mechanics which are believed to be wholly intractable for both classical and quantum computers.

“Quantum key distribution (QKD) is the only provably secure communication method because it uses physics — not math — to encrypt data” − QUANTUMXCHANGE

One leading cryptographic candidate is quantum key distribution (QKD), which operates through the transmission of millions of polarized photons over a fiber optic cable, each of which has a random quantum state. Upon arrival at the endpoint, the receiver guesses which of two beam splitters to “read” the polarization of each photon with, before transmitting information regarding the beam splitters used and the order of the photons back to the sender — this information is then compared with the sequence of polarizers initially used to send the photons, in order to discard the particles that were read using the wrong beam splitter. The resulting sequence of bits becomes a unique, randomly generated optical key.

This QUANTUMXCHANGE illustration depicts QKD’s mechanism for distributing & sharing secret keys.
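A toy classical simulation of this sifting step (with no real photons, and with eavesdropping detection omitted) might look as follows:

    import random

    def bb84_sift(n_photons=20, seed=1):
        """Toy BB84-style sifting: keep only the bits measured in the basis they were sent in."""
        rng = random.Random(seed)
        sender_bits    = [rng.randint(0, 1) for _ in range(n_photons)]    # raw random key material
        sender_bases   = [rng.choice("+x") for _ in range(n_photons)]     # polarization basis per photon
        receiver_bases = [rng.choice("+x") for _ in range(n_photons)]     # the receiver guesses a basis

        # A matching basis reads the bit correctly; a mismatched basis gives a random result.
        received = [bit if sb == rb else rng.randint(0, 1)
                    for bit, sb, rb in zip(sender_bits, sender_bases, receiver_bases)]

        # Public discussion: compare the bases (never the bits) and discard mismatched positions.
        return [bit for bit, sb, rb in zip(received, sender_bases, receiver_bases) if sb == rb]

    print(bb84_sift())   # roughly half of the transmitted photons survive sifting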

Closing Thoughts

Buzz and hype aside, the potential for quantum computing to radically disrupt a wide range of industries is very real. Just as impressive as these future applications, though, are the fundamental principles governing quantum technologies; from superposition and entanglement to decoherence and architecture, it is only with a firm grasp of these challenging concepts that we can realize scalable, transformative quantum computing systems.

“If you are confused by the underlying principles of quantum technology — you get it!” − Kevin Coleman

Alex K

17 y/o researcher in Machine Learning & Computational Biology.