Neither magical nor the fastest: myths and realities about quantum computers



In October 2019, quantum computing dominated news headlines around the world for several days. A team of researchers from the tech giant Google had managed to achieve quantum supremacy, beating the largest supercomputers on the planet with a quantum computer. Not only that, but the time difference was simply staggering: a few minutes versus the thousands of years required to perform the same calculation on a traditional computer.

Dozens of articles and reports in the press, radio and television echoed this historic milestone and tried to explain to the non-specialized public what Google’s achievement really consisted of and what those mysterious quantum computers that had been used to achieve it were. Despite their good intentions, most of these explanations probably raised more questions than they answered.

No magic or fantastic superpowers

In popular articles on quantum computing it is common to find a series of recurring analogies and images that do not correspond to reality and that contribute to creating false myths around the true capabilities of quantum computers.

One of the most repeated is that “a quantum computer finds the solution to a problem by simultaneously testing all possible options.” The problem with this explanation is not that it oversimplifies the operation of quantum computers. Rather, it seems to endow them with fantastic superpowers whereby completing any calculation is a matter of pressing a button and waiting a few seconds.

But then isn’t it true that a quantum computer uses massive parallelism to explore all the solutions to a problem at the same time? As in many things that have to do with the quantum world, the answer is both yes and no. It is true that one of the main properties on which quantum algorithms rely is superposition, that mysterious tendency of certain physical systems to find themselves in a combination of several different states. But that is only a part, and a rather small one, of the whole story.

We could define quantum computing as the discipline that studies the use of the properties of subatomic particles to perform calculations. Among these properties is, yes, superposition, but also entanglement and interference.

In a way, we could say that a quantum algorithm first creates a superposition of many possibilities to explore, then entangles these possibilities with their results, and finally makes the unwanted solutions interfere destructively with each other so that only those that interest us survive.

This phase of annihilating unfavorable options is the most difficult and delicate part of the whole process. It is a kind of complex mathematical choreography, to use the words of Scott Aaronson and Zach Weinersmith, that we only know how to carry out in some specific problems. Moreover, it has long been shown that in certain tasks it is not possible to take advantage of quantum computing to achieve faster calculations compared to traditional computers.
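For readers who like to see the idea in action, destructive interference can be illustrated with a few lines of Python. This is a toy sketch of a single qubit’s amplitudes under the Hadamard gate, not a real quantum program:

```python
import math

# Toy illustration (not from the article): the Hadamard gate acting on
# the two amplitudes of a single qubit.
def hadamard(amplitudes):
    a0, a1 = amplitudes
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

ket0 = (1.0, 0.0)            # the qubit starts in the state |0>

# One Hadamard creates an equal superposition of |0> and |1>...
superposed = hadamard(ket0)  # (0.707..., 0.707...)

# ...but a second Hadamard makes the two paths leading to |1> cancel
# each other out, leaving the qubit back in |0> with certainty.
back = hadamard(superposed)  # (1.0, 0.0)
print(superposed, back)
```

The second step is the interference at work: the amplitudes for the unwanted outcome add up with opposite signs and annihilate each other, which is exactly the effect a quantum algorithm must orchestrate at a much larger scale.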

A quantum computer is not, therefore, that magical device capable of instantly solving any problem that the tabloid press sometimes wants to sell us. But neither is it simply a faster computer.


Not only faster

Another fallacy that is common in popular articles on quantum computers is the reduction of all their capabilities to a mere increase in speed. I’ve lost count of the number of times I’ve come across claims like “scientists develop a quantum computer a million times faster than traditional computers”. As striking as these statements may be, they are totally wrong.

We are used to the big microchip manufacturers announcing, every few months, new developments that are twenty, thirty or fifty percent faster than their predecessors. But a quantum computer does not base its operation on a simple advance in technology that allows the same operations to be carried out more quickly.

On the one hand, for some tasks a quantum computer may not outperform a classical computer at all. And in the cases in which a quantum computer does offer an advantage over traditional devices, the difference cannot be measured with a single number.

A quantum computer runs algorithms radically different from those used by a classical computer. This makes the advantage of the quantum device grow as the size of the problem we want to solve increases. For example, for list search problems, a quantum computer will be five times faster than a traditional one with a hundred data points, fifty times faster with ten thousand items, and five hundred times faster with a million records.
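Those figures are not arbitrary: they follow from assuming that a classical search examines, on average, half of the N items in the list, while a quantum search of the kind proposed by Lov Grover needs on the order of √N steps. A back-of-the-envelope sketch, with constant factors ignored:

```python
import math

# Back-of-the-envelope sketch (assumptions: an average-case classical
# search examines N/2 items; a Grover-type quantum search needs about
# sqrt(N) steps; all constant factors are ignored).
def speedup(n_items):
    classical_steps = n_items / 2
    quantum_steps = math.sqrt(n_items)
    return classical_steps / quantum_steps

for n in (100, 10_000, 1_000_000):
    print(n, round(speedup(n)))  # 5, 50 and 500: the ratios quoted above
```

The ratio itself grows like √N: multiply the list by a hundred and the quantum advantage is multiplied by ten.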


It is precisely this increased advantage of quantum computers as the size of the data to be processed grows that makes them especially attractive when it comes to tackling problems that are intractable with traditional computers. This is the case of tasks such as finding the factors of very large integers, on whose difficulty the security of many of the encryption protocols used in our digital communications is based.

The time required to solve this problem using the best available classical algorithms grows almost exponentially with the length of the numbers, so increasing the size of a key by a few tens of bits would make it millions of times more secure. However, the mathematician Peter Shor showed more than twenty years ago that breaking this type of encryption would be feasible in practice if quantum algorithms were used.
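To get a feel for that growth, here is an illustrative sketch of my own. It assumes the standard heuristic cost formula for the general number field sieve, the best known classical factoring method, and a rough cubic gate-count estimate for Shor’s algorithm; these are asymptotic formulas, not real timings:

```python
import math

# Illustrative growth comparison (my own sketch, not from the article).
# Assumptions: the heuristic number-field-sieve cost
# exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3)) for classical
# factoring, and a rough n^3 estimate for Shor's quantum algorithm.
def classical_cost(bits):
    ln_n = bits * math.log(2)  # ln N for an n-bit number N
    return math.exp((64 / 9) ** (1 / 3)
                    * ln_n ** (1 / 3)
                    * math.log(ln_n) ** (2 / 3))

def shor_cost(bits):
    return bits ** 3

# Doubling the key size multiplies the classical estimate by many
# orders of magnitude, but the quantum estimate only by a factor of 8.
print(f"{classical_cost(2048) / classical_cost(1024):.2e}")
print(shor_cost(2048) / shor_cost(1024))
```

This asymmetry is the whole point: making keys longer is cheap protection against classical attackers, but it barely slows down a hypothetical quantum one.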

Cryptography is not the only field where quantum computers can offer a huge advantage over traditional computing. For example, the simulation of new materials or the study of chemical compounds are two of the most promising applications of quantum computing. Once again, these are extremely difficult tasks for classical computers because the number of parameters that describe the behavior of physical and chemical systems grows exponentially with the number of particles that compose them. But the quantum properties of such systems make their simulation with quantum computers natural, as physicist Richard Feynman pointed out even before quantum computing existed as a scientific discipline.

Thus, in recent years many researchers have developed algorithms specifically designed to study the properties of chemical molecules using quantum computers. One of the most famous is the so-called Variational Quantum Eigensolver (VQE), which has the peculiarity that it can be used even with the small, noise-sensitive quantum computers we have today.

With this method, it has been possible to simulate some small molecules on real quantum hardware, reaching a precision equivalent to that of classical calculations. Although we are still far from surpassing traditional computers in this task, the rate at which the capabilities of quantum computers are growing, together with the improvements in the algorithms used, suggests that this could be one of the first practical applications of the technology.
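The variational idea behind VQE can be sketched in a few lines: a classical optimizer tunes the parameters of a trial quantum state so as to minimize its energy. The following toy example is my own illustration, with a made-up single-qubit Hamiltonian rather than the molecules or hardware mentioned above:

```python
import math

# Toy sketch of the variational principle behind VQE (my illustration,
# not the algorithm run in any real experiment).
H = [[1.0, 0.5],
     [0.5, -1.0]]  # hypothetical 2x2 Hamiltonian, arbitrary units

def energy(theta):
    # Trial state prepared by a one-parameter "circuit": a rotation
    # of |0> by the angle theta.
    psi = (math.cos(theta / 2), math.sin(theta / 2))
    h_psi = (H[0][0] * psi[0] + H[0][1] * psi[1],
             H[1][0] * psi[0] + H[1][1] * psi[1])
    return psi[0] * h_psi[0] + psi[1] * h_psi[1]  # <psi|H|psi>

# Crude classical outer loop: scan theta and keep the lowest energy.
# (A real VQE would use a smarter optimizer and a quantum processor
# to estimate each energy.)
best_theta = min((k / 1000 * 2 * math.pi for k in range(1000)),
                 key=energy)
print(energy(best_theta))  # close to the exact ground energy -sqrt(1.25)
```

On a real device, the energy of each trial state is estimated by repeated measurements on the quantum processor, while the parameter search stays on a classical computer, which is why the method tolerates today’s small, noisy machines reasonably well.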


Quantum computing and artificial intelligence

Other fields in which research into quantum computing applications is currently particularly intense are artificial intelligence and optimization. Specifically, several quantum algorithms have been proposed to speed up the tasks involved in training machine learning models on large collections of data.

In some cases, with techniques similar to those used by Shor in the development of his factorization algorithm, an exponential gain is achieved with respect to the corresponding classical method. However, since we must transfer the data to the quantum computer one by one from the files in which they are stored, the bottleneck would not be in the processing of information, but in reading it. Possible solutions would be the use of data captured directly with quantum sensors, which would avoid having to load them from an external device, and the development of quantum memories that allow data to be read in superposition.

In addition to the study of techniques to speed up classical machine learning processes, purely quantum models are also being investigated, for example, so-called quantum neural networks. Since these proposals are relatively recent, their full capabilities are not yet known, but there is evidence showing that their performance is superior to that of classical methods on certain artificially created data sets.

As John Preskill, one of the world’s leading experts in quantum computing, has rightly pointed out, just as the applications of classical neural networks were developed without, in every case, a solid and exhaustive theory of why they work, the increased availability of quantum computers on which to run and tune quantum neural networks will very likely lead to use cases that we cannot foresee today.

Quantum computers are not the solution to all computational and data processing problems that we may pose. They are not magical devices with which any calculation can be performed instantly. But they’re not just faster versions of the computers we have today, either. For tasks where it is possible to gain an advantage using quantum computing, the runtime gain increases as the size of the problem becomes larger.

If we take into account that the applications of quantum computers include fields of such relevance as cybersecurity, the simulation of physical and chemical processes or artificial intelligence, the fact that quantum computing is not a tool that works for everything does not diminish its value but simply qualifies it. Having quantum computers will not mean the end of our computational limitations, but we can be sure that it will mean a profound change in the way we calculate and process data and, therefore, a radical transformation of our society.

The original version of this article was published in number 119 of Fundación Telefónica’s Telos Magazine.
