Double or nothing: could quantum computing replace Moore’s Law?

Have we reached the limit of how small computer hardware can be? Mullenkedheim/Flickr

Ever noticed that computers become outdated remarkably quickly? It’s nice to have increasingly powerful computers available, and very profitable for the computer industry to have a new model available every couple of years. So how do they manage to make it happen?

Every new generation of computer hardware has roughly twice the processing power of the version two years before it. It’s a phenomenon known as Moore’s Law and it’s held true for nearly 50 years.

But could Moore’s Law be coming to an end? Could we be reaching the limit of how fast computer processors can actually be? And if so, what then?

End of an era

Moore’s Law states that the number of transistors that fit on a certain area on a computer chip doubles every two years.
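
To see how quickly that doubling compounds, here’s a back-of-the-envelope sketch in Python (our own illustration – the 2,300-transistor Intel 4004 of 1971 is simply a commonly cited starting point):

```python
# A back-of-the-envelope sketch (our own illustration): start from the
# Intel 4004's 2,300 transistors in 1971 and double every two years.
count, year = 2_300, 1971
while year <= 2011:
    print(f"{year}: ~{count:,} transistors")
    count, year = count * 2, year + 2
```

Twenty doublings later, the projection lands at roughly 2.4 billion transistors by 2011 – about what real processors of that era actually packed in.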

In the past few years, it’s become clear that we’re reaching the limit of just how small, and just how powerful, we can make processors. As a result, developers are now looking towards radical design changes, using exotic materials, and applying plenty of creative thinking in the quest for solutions.

One of the fields attracting a lot of attention is the study of the quantum behaviour of electrons and how it applies to computing.

A quantum future

Existing (or “classical”) computer hardware works by storing data in a binary format within transistors. The smallest piece of information – a “bit” – can have one of two states: “off” or “on”, “0” or “1”.

Quantum computing, on the other hand, allows us to use many physical systems (such as electrons, photons, or tiny magnets) as quantum bits, or “qubits”.

These qubits can be engineered to contain the same binary information as classical bits – i.e. “0” or “1” – but that’s not all. Unlike any existing computer, one made of qubits can also encode an exponentially larger amount of information than a simple binary state.

Let’s put this into perspective.

Fourteen bits in your computer’s central processing unit (CPU) can contain, well, 14 bits of binary information – 14 pieces of information which are either “0” or “1”.

Conversely, 14 qubits in a quantum computer can contain the equivalent of 2¹⁴ bits of information. That’s 16,384 bits, far more than the 14 pieces of binary information possible in a classical system.

Let’s take it one step further and use 300 qubits as an example. Three hundred qubits is the equivalent of 2³⁰⁰ classical bits, which is more than the estimated number of particles in the observable universe.
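
If you’d like to see where these figures come from, here’s a minimal Python sketch (our own illustration, not from any quantum software package): describing the state of n qubits takes 2ⁿ complex numbers, so the bookkeeping doubles with every qubit you add.

```python
# A minimal sketch (our own illustration): the state of n qubits is
# described by 2**n complex amplitudes, which is where the 2^14 and
# 2^300 figures above come from.
for n in (1, 2, 14, 300):
    print(f"{n:>3} qubits -> {2 ** n:,} amplitudes")

# 14 qubits -> 16,384 amplitudes; 300 qubits -> a 91-digit number,
# larger than the estimated particle count of the observable universe.
```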

Tangled up in blue

So how can quantum bits store so much more information than classical bits? Well, it’s all down to a phenomenon known as quantum entanglement.

A quantum particle is said to be “entangled” with another when its properties are only defined in relation to the other. Two entangled quantum particles could be physically separated, but if you observe them individually you will find correlations between them that cannot be accounted for by assuming they act independently of each other.

It may appear as if acting on one particle influences the other one instantly, even faster than the speed of light.

In reality, the entanglement makes the particles acquire “non-local” properties. No “action at a distance” is required, and the principles of relativity (i.e. no information can be transported faster than the speed of light) are respected.

Odd as this may sound, entangled particles create a distinguishable and legitimate state that can be used as a code to carry additional information without using additional bits.

The availability of these entangled states is the reason quantum bits can encode exponentially more information than classical ones.
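
For the technically inclined, here’s a minimal sketch using Python and NumPy (our own illustration) of the most famous entangled state – the Bell state – showing the perfect correlations described above:

```python
# A minimal sketch (our own illustration): build the entangled Bell
# state (|00> + |11>)/sqrt(2) and sample joint measurements of the
# two qubits.
import numpy as np

rng = np.random.default_rng(seed=1)

# Amplitudes over the two-qubit basis states |00>, |01>, |10>, |11>.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2          # Born rule: measurement probabilities

for outcome in rng.choice(4, size=8, p=probs):
    a, b = divmod(int(outcome), 2)  # split into the two qubits' results
    print(f"A measures {a}, B measures {b}")  # the results always agree
```

Each qubit on its own looks like a fair coin flip, yet the two results always match – exactly the kind of correlation that no pair of independent classical bits can reproduce.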

There’s always a “but”…

While qubits can store an exponentially greater amount of information than classical bits, quantum computing is still in its infancy.

In fact, at the moment, there are only a few examples where quantum computers can be used to complete tasks more effectively than classical hardware. These include:

  • The ability to decipher encrypted information much faster than is currently possible
  • The ability to search an unsorted database quickly and effectively (see the sketch after this list).
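
To give a feel for the second of these, the sketch below (our own illustration) compares how many “look at one entry” queries are needed to find a marked item among N: roughly N/2 on average for a classical unsorted search, versus roughly (π/4)√N oracle queries for Grover’s quantum algorithm.

```python
# A rough sketch (our own illustration): query counts for finding one
# marked item in an unsorted collection of N entries.
import math

for n in (1_000, 1_000_000, 1_000_000_000):
    classical = n // 2                              # average classical lookups
    grover = math.ceil(math.pi / 4 * math.sqrt(n))  # Grover oracle queries
    print(f"N = {n:>13,}: ~{classical:>11,} classical vs ~{grover:,} quantum")
```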

The most advanced calculation done with quantum bits so far is the factoring of 15 = 3 × 5.

This may seem unimpressive, but it proves that quantum computing can be used in this capacity. With more research and more time, we’ll be able to factorise extremely large numbers – ones that are thousands of digits long – in a matter of minutes, rather than the millions of years it would take now.
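
For the curious, here’s a classical Python sketch (our own illustration – it brute-forces the step a quantum computer would accelerate) of the number theory at the heart of Shor’s factoring algorithm, applied to 15:

```python
# A minimal sketch (our own classical illustration of the number-theory
# core of Shor's algorithm, not the quantum circuit itself): a quantum
# computer's job is to find the order r of a modulo N; once r is known,
# the factors fall out classically.
from math import gcd

N, a = 15, 7          # a must share no factor with N; gcd(7, 15) == 1

# Find the order r: the smallest r > 0 with a**r % N == 1.
# (This brute-force loop is the step a quantum computer speeds up.)
r = 1
while pow(a, r, N) != 1:
    r += 1

assert r % 2 == 0     # here r = 4, which is even as required
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(r, p, q)        # prints: 4 3 5, i.e. 15 = 3 x 5
```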

Given these limitations, it’s not true to say that quantum computers will be able to replace existing computers. For one thing, the expected clock speed of a quantum computer is not likely to be any faster than that of a classical one.

Therefore, if we run the same algorithm on a quantum and on a classical computer, the classical one will usually win. Quantum computers will only be better if an algorithm exists where the presence of entangled quantum states can be exploited to reduce the number of steps required in a calculation.

At this stage we don’t know of any quantum algorithm to reduce the complexity of, say, web browsing or text editing, but the search is on.

A quantum future, today

Regardless of how powerful and widespread quantum computers will be in decades to come, the basic research being undertaken to construct these machines is already very useful in the construction of classical systems.

One of the most promising uses for quantum computing today involves coupling single atoms to silicon transistors – that is, the exact same components used in classical computers, but scaled down to single atoms.

In this way, much of what we learn in the pursuit of a quantum computer can be reused to push classical computers a step further in their miniaturisation.

Quantum computing won’t provide us with a replacement to classical computers if and when Moore’s Law grinds to a halt.

But it will help solve some interesting and challenging problems in computing.
