Every once in a while, it’s time for an experiment. Today? I removed the images from the post because some email apps do not display images anyway. What do you think?
Please vote at the end of the post. Alternatively, you can read the post on Medium (with images).
Quantum computing is a different form of computation, one that can change the complexity of solving certain problems and make the intractable tractable. But this different form of computation brings its own challenges. Digital computers only need to distinguish between two states: 0 and 1.
The circuits need to tell the difference between a high voltage and a low voltage. A high voltage represents 1, a low voltage 0. This discretization means that errors must be relatively large to be noticeable, and methods for detecting and correcting such errors can be implemented.
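Because classical bits are discrete, even a very simple scheme can catch and fix errors. A minimal sketch in Python of a three-bit repetition code with majority voting (illustrative names, not any particular library):

```python
# Classical bits are discrete, so a simple repetition code can detect and
# correct a single error: store each bit three times and take a majority vote.

def encode(bit: int) -> list:
    """Store the bit three times."""
    return [bit, bit, bit]

def correct(noisy: list) -> int:
    """Majority vote over the three copies recovers the original bit."""
    return 1 if sum(noisy) >= 2 else 0

codeword = encode(1)
codeword[0] ^= 1            # a single bit flip caused by noise
print(correct(codeword))    # majority vote still recovers the original: 1
```

As long as noise flips at most one of the three copies, the original bit survives. Quantum states, as we will see, do not allow this kind of naive copying, which is part of why quantum error correction is so much more expensive.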
Quantum computers, by contrast, must be far more precise. They maintain a continuous quantum state, and quantum algorithms require the precise manipulation of continuously varying parameters. In a quantum computer, errors can therefore be arbitrarily small and impossible to detect, yet their effects can still build up and ruin a computation.
This fragile quantum state is highly vulnerable to noise from the environment around the quantum bit. Noise can arise from control electronics, heat, or impurities in the quantum computer’s material itself, and it can cause serious computing errors that are difficult to correct.
But to keep the promises quantum computing makes, we need fault-tolerant devices. We need devices that can run Shor’s algorithm for factoring. We need devices that can execute all the other algorithms developed in theory that solve problems intractable for digital computers.
But such devices require millions of quantum bits. This overhead is needed for error correction, since most of these sophisticated algorithms are extremely sensitive to noise.
Current quantum computers have up to 65 quantum bits. Even though IBM strives for a 1,000-qubit computer by 2023, the quantum processors we can expect in the near term will have up to 100 quantum bits. And even if they exceed these numbers, they will remain relatively small and noisy. These computers can only execute short programs, since the longer a program runs, the more noise-related output errors occur.
Nevertheless, programs that run on devices with more than 50 quantum bits are already extremely difficult to simulate on classical computers. These devices can do things that are infeasible for a classical computer.
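The 50-qubit threshold comes down to memory: simulating a quantum state exactly means storing one complex amplitude per basis state, and the number of basis states doubles with every added qubit. A back-of-the-envelope sketch:

```python
# Memory needed to hold a full n-qubit state vector on a classical machine:
# 2**n complex amplitudes, each 16 bytes in double precision.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

print(statevector_bytes(30) // 2**30, "GiB for 30 qubits")  # 16 GiB
print(statevector_bytes(50) // 2**50, "PiB for 50 qubits")  # 16 PiB
```

At 30 qubits the state vector still fits in a workstation’s RAM; at 50 qubits it needs petabytes, beyond even the largest supercomputers, which is why exact simulation breaks down around this scale.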
And this is the era we’re about to enter: the era when we can build quantum computers that, while not fault-tolerant, can do things classical computers can’t. This era is described by the term “Noisy Intermediate-Scale Quantum” (NISQ).
“Noisy” because we don’t have enough qubits to spare for error correction. And “Intermediate-Scale” because the number of quantum bits is too small to run sophisticated quantum algorithms but large enough to show quantum advantage or even supremacy.
The best quantum computers we have today are NISQ-devices. These devices require a different set of algorithms, tools, and strategies.
For instance, Variational Quantum-Classical Algorithms have become a popular way to think about quantum algorithms for near-term quantum devices. In these algorithms, a classical computer performs the overall optimization, a machine-learning-style training loop, on information it acquires from running certain hard-to-compute calculations on a quantum computer.
The overall algorithm consists of a closed-loop between the classical and quantum components. It has three parts:
Pre-processing
PQC
Post-processing
The quantum part produces information based on a set of parameters the classical algorithm provides. Circuits of this kind are therefore called Parameterized Quantum Circuits (PQCs).
These PQCs are relatively small, short-lived, and thus suited for NISQ-devices.
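Here is a minimal sketch of such a closed loop, simulating a single-qubit circuit in plain Python rather than using any real quantum SDK (all function names are illustrative). The “quantum” part runs a parameterized circuit RY(theta) on |0> and estimates the expectation value <Z> from repeated shot-based measurements; the “classical” part updates theta with gradient descent, using the parameter-shift rule to estimate the gradient:

```python
import math
import random

random.seed(7)

def run_pqc(theta: float, shots: int = 1000) -> float:
    """Simulate RY(theta)|0> and estimate <Z> from `shots` noisy measurements."""
    p0 = math.cos(theta / 2) ** 2            # probability of measuring 0
    ones = sum(random.random() >= p0 for _ in range(shots))
    return (shots - 2 * ones) / shots        # <Z> = P(0) - P(1)

def parameter_shift_gradient(theta: float) -> float:
    """Estimate d<Z>/dtheta by evaluating the circuit at shifted parameters."""
    return 0.5 * (run_pqc(theta + math.pi / 2) - run_pqc(theta - math.pi / 2))

# Classical optimization loop: drive <Z> toward its minimum of -1 (theta = pi).
theta = 0.3
for step in range(100):
    theta -= 0.2 * parameter_shift_gradient(theta)

print(f"theta = {theta:.2f}, <Z> = {run_pqc(theta, shots=10000):.2f}")
```

The same pre-process, run-the-PQC, post-process loop underlies near-term algorithms such as VQE and QAOA, just with larger circuits and many more parameters. Because each circuit execution is short, the quantum state only needs to survive for a brief moment, which is exactly what NISQ hardware can offer.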
So, what do you think?