## Quantum of what?

Physicists have been wrestling for more than a century with quantum mechanics, the set of rules that governs the behaviour of atoms and other systems that are small compared to a human being. In fact, even what quantum mechanics *is about* is not so obvious: first, if everything is made up of small constituents, then *everything* should be subject to the rules of quantum mechanics. Second, there are systems such as superconductors or superfluids that are large, yet undeniably exhibit quantum behaviour. In the end, one might ask:

What is a quantum system? Anything that obeys the laws of quantum mechanics.

To complicate matters further, there is not even real agreement on what those rules exactly are, let alone where they come from in the first place. There are, however, a number of features that are unmistakably quantum, *i.e.* never found among the properties of everyday (“classical”) objects:

- The possible states of a quantum system are described by a mathematical object called the **wavefunction**. The **Schrödinger equation** determines how the wavefunction changes in space and time;
- Before a measurement is performed, the states are said to be in **superposition**;
- When a measurement is performed, a **single state** is selected out of all possible ones;
- Each state is associated with a **probability** of being the outcome of a measurement, which equals the squared magnitude of the piece of the wavefunction corresponding to that state (**Born rule**);
- The complete state of the wavefunction cannot be copied (**no-cloning theorem**).
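The Born rule is easy to see with numbers. Below is a minimal sketch in plain Python, with amplitudes `a` and `b` made up for illustration: each outcome's probability is the squared magnitude of its amplitude, and the probabilities sum to one.

```python
import math

# A hypothetical wavefunction for a two-state system, written in the
# basis {|0>, |1>} with complex amplitudes a and b (values made up
# for illustration).
a = complex(1 / math.sqrt(3), 0)      # amplitude of |0>
b = complex(0, math.sqrt(2 / 3))      # amplitude of |1>

# Born rule: the probability of each outcome is the squared magnitude
# of the corresponding amplitude.
p0 = abs(a) ** 2
p1 = abs(b) ** 2

print(round(p0, 3), round(p1, 3))     # 0.333 0.667
assert math.isclose(p0 + p1, 1.0)     # probabilities sum to one
```

Note that the phases of `a` and `b` drop out of the probabilities; they matter only when amplitudes interfere.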

One consequence of the Schrödinger equation, when applied to pairs of particles, is the phenomenon called **entanglement**, considered the telltale sign of quantum systems. Other fascinating yet counter-intuitive features are tunneling (particles passing through a barrier) and the no-cloning theorem.

As a matter of fact, no one knows for sure why the Born rule exists at all. Even more puzzling is the origin of entanglement: a possible explanation is that the information concerning fundamental objects is not associated with the objects themselves, but “spread” across the whole system, which includes the particles under observation, the measurement apparatus, the environment and the observer. It’s an intricate, delicate ballet.

Two key words have to be remembered: *causality* and *locality*. Causality means that *if an event A causes another event B, then B must always follow A*. Locality, instead, means that information cannot propagate faster than the speed of light in vacuum *c*, as forbidden by the theory of special relativity; this rules out the so-called *action at a distance*. The two definitions seem very similar, but in fact they are not: locality is more basic, as causation seems to be a consequence of an exchange of some form of information.

At the heart of the matter is what happens when two particles interact with each other, “fly away”, and are subsequently subjected to a measurement.

Let’s say that each particle has a property (*e.g.* *spin*, *polarization*, etc.) that can take on a binary value, 0 or 1 (or +1 and -1). In symbols, we write |0> or |1>. A system composed of the two particles is denoted as a sum over all possible states: |Ψ> = a|00> + b|01> + c|10> + d|11>. However, the aforementioned interaction introduces a correlation between the properties of the two particles, forcing the system into a superposition of only a subset of opposite states, *e.g.* |Ψ> = a|01> + b|10>. Before the measurement, the system is in a superposition of states. When a measurement is performed on the first particle, it not only reveals its state as being either |0> or |1>, but at the same time also gives information about the second particle. That is because the measurement singles out one of the two combinations of states, *i.e.* either |01> or |10>, with probability determined by the Born rule. Experiments demonstrate that this happens instantaneously, but also indicate that there is no exchange of information, as we cannot *force* the state of the first particle to take a certain value, but only *observe* it, whatever it turns out to be, *at the time of the experiment*.
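A toy simulation can make the correlation concrete. The sketch below (plain Python, with equal amplitudes chosen for illustration) samples measurements of a state a|01> + b|10> according to the Born rule: the two bits always come out opposite, yet nothing is transmitted between them.

```python
import math
import random

random.seed(0)

# Entangled state |Psi> = a|01> + b|10>; amplitudes made up for illustration.
a = 1 / math.sqrt(2)
b = 1 / math.sqrt(2)

def measure():
    """Born rule: outcome |01> with probability |a|^2, else |10>."""
    return "01" if random.random() < abs(a) ** 2 else "10"

outcomes = [measure() for _ in range(10_000)]

# The two particles are always found in opposite states...
assert all(o in ("01", "10") for o in outcomes)

# ...and each combination occurs with the Born-rule frequency.
frac_01 = outcomes.count("01") / len(outcomes)
print(round(frac_01, 2))              # close to |a|^2 = 0.5
```

Of course, a classical program samples the outcomes from a shared rule, which is exactly the kind of “local hidden variable” picture that Bell tests rule out; the sketch only illustrates the correlation, not its origin.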

So, do the particles have a pre-determined state (|01> or |10>) that is revealed by the experiment, or are their values *created* by the measurement itself? If the former is true, what is the mechanism that determines their state? If the latter is true, how does the second particle get to know about the status of the first without violating special relativity? What if *information* about the second particle is also contained in the first particle and in the environment (observer included)? It really looks like a paradox; we do not know for sure. Here’s where physicists disagree, sometimes quite strongly. Albert Einstein, who was a staunch realist, never accepted the Copenhagen interpretation and famously coined the derogatory expression “spooky action at a distance”.

## The meaning of QM

The situation is so confusing to the experts that several **interpretations** of the theory have blossomed in the past century. It’s not the first time we have faced a situation like this: it already happened with thermodynamics and with the propagation of light. The most popular interpretations of QM are called Copenhagen (considered the “quantum orthodoxy”), Bohm-De Broglie, Many-Worlds and QBism, but the list does not end there. The main divide comes from a simple question: *what is QM all about?* Is it a description of reality (*realism*) or of what we *know* about reality? Thanks to the work of John Stewart Bell and others who followed, experimental physicists gained the conceptual tools to test the assumptions made by some of these interpretations. The conclusion so far seems to be that QM is either *real but non-local* (Bohm-De Broglie), or *non-real but local* (Copenhagen). QBism insists that it does not really matter whether there is a reality or not: QM is just a theory of our knowledge. One interpretation still escapes any chance of being put to a test: in the Many-Worlds interpretation (MWI), the universe as a whole is in a superposition of states, and it branches every time a measurement is performed (you can even install an app on your mobile phone that splits the universe in two just by pressing a button, for just $1.99). The two copies of the universe are identical, except for one single bit of information, and will diverge forever after. Let me be clear: each interpretation has pros and cons, and none is evidently better than the others.

I have a personal preference for the Bohm-De Broglie interpretation, but I must say that this is mostly because it agrees with some preconceptions of mine, rather than because it is more credible than the others; and I thoroughly dislike MWI for its shaky definition of probability. Also, as I will make clearer later on, I tend to believe that QM is more a theory of **information** than of physical systems themselves. In fact, some people believe that the list of properties above is just a *historical* accident due to the way QM was discovered by human beings, and that the theory should be reconstructed from more fundamental axioms.

## Tapping onto the bedrock of reality

All of the above might seem very fascinating, but a question remains: is there any use for these strange features? Indeed, the applications are almost uncountable. The most remarkable one, for our society, is probably tunneling: under certain circumstances, a particle can pass through a barrier that would be insurmountable in the classical realm (think of banging your head against a wall: in the quantum realm, sooner or later you would end up in the room next door). Most consumer electronics is based on this effect, in the form of diodes and transistors. However, it still looks like we are only scratching the surface of what’s possible.

If there is one thing that the history of science and technology tells us, it is that we learn about the things we discover by playing around with them. A formidable example is thermodynamics: its original formulation dates back to the late 1700s, but while people were still debating the existence of atoms in the early 1900s (“*science advances one funeral at a time*”, as Max Planck once said), engineers developed and improved the steam engine and other contraptions long before the matter was settled. Regardless of the interpretation of the laws of thermodynamics, humans were able to harness their power and deploy their features to create machines that changed the face of society forever.

Physicist Seth Lloyd of MIT once said that, as a matter of fact, the Universe is computing something. Not in the sense that it was *designed to be* a computer, but that it *behaves* like one. Let me give an example: 2 + 4 = 6. If I now ask you “I know x + y = 6, what are the values of x and y?”, there is no definite answer: (1,5), (2,4), (3,3) are all equally probable. This means that the operation of addition reduced the amount of information in the Universe: we cannot go back and retrieve *x* and *y*. Similarly, once a measurement is performed, all but one of the possible states in the superposition are lost forever. Thus, performing a measurement on a quantum system reduces the information and is equivalent to an operation.

You can see where I’m going: we can use quantum states to perform calculations. That’s the essence of *quantum computing*. As a matter of fact, “pure” quantum computing replaces the familiar unit of information (the “bit”) with its quantum equivalent, the “**qubit**”. A qubit is, by definition, a superposition of |0> and |1> until we observe its value. Two qubits can be entangled and manipulated via **quantum gates**, *i.e.* the analogue of the boolean logic gates (NOT, AND, OR, XOR, etc.) of common electronics. The beauty of it is that quantum systems are able to do things that are *impossible* (not hard: *impossible*) for classical systems, allowing us to devise **algorithms** that are specific to the quantum realm. Before we go on, let’s be clear about one point: there are computational devices based on quantum phenomena such as annealing (more on this later), and devices based on the manipulation of qubits via quantum gates. The latter are usually referred to as *Universal Quantum Computers* or Quantum Turing Machines (QTM).
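As a sketch of the idea (toy code, not a real quantum simulator), a qubit can be modelled as a pair of amplitudes that collapses the first time it is measured:

```python
import math
import random

random.seed(1)

class Qubit:
    """Toy qubit: a superposition amp0*|0> + amp1*|1> until measured."""

    def __init__(self, amp0, amp1):
        norm = math.sqrt(abs(amp0) ** 2 + abs(amp1) ** 2)
        self.amp0, self.amp1 = amp0 / norm, amp1 / norm

    def measure(self):
        # Born rule: outcome 0 with probability |amp0|^2; the state
        # then collapses onto the observed basis state.
        if random.random() < abs(self.amp0) ** 2:
            self.amp0, self.amp1 = 1.0, 0.0
            return 0
        self.amp0, self.amp1 = 0.0, 1.0
        return 1

q = Qubit(1.0, 1.0)          # equal superposition of |0> and |1>
first = q.measure()          # random: 0 or 1
# After the collapse, repeated measurements always agree with the first.
assert all(q.measure() == first for _ in range(100))
```

What the toy model deliberately leaves out is everything that makes qubits powerful: relative phases, interference and entanglement between qubits.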

In quantum computing, a number of basic gates constitute the foundation of more complex algorithms. Those are:

- X gate: flips the qubit:
- X|0> = |1>
- X|1> = |0>

- Hadamard gate: puts a single qubit into an equal superposition of |0> and |1>:
- H|0> = (|0>+|1>)/√2 = |+>
- H|1> = (|0>-|1>)/√2 = |->

- Phase gate: rotates the phase of one of the basis states by an angle θ (the subscript indicates which state acquires the phase):
- U_0(θ)|0> = exp(iθ)|0>
- U_0(θ)|1> = |1>
- U_1(θ)|0> = |0>
- U_1(θ)|1> = exp(iθ)|1>

- Controlled-NOT (CNOT) gate: flips the first (target) qubit only if the second (control) qubit is |1>:
- C|00> = |00>
- C|01> = |11>
- C|10> = |10>
- C|11> = |01>

Under some circumstances, algorithms that involve more than 2 qubits can be decomposed into simpler, binary operations that are combinations of the ones listed above (tensor networks).
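The truth tables above can be checked directly by writing the gates as small matrices (a sketch in plain Python, real amplitudes only). The last lines also show how a Hadamard followed by a CNOT produces an entangled Bell state; as in the truth table above, the CNOT takes the second qubit as control.

```python
import math

def matvec(m, v):
    """Apply a gate (matrix, list of rows) to a state vector."""
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

# Single-qubit gates in the {|0>, |1>} basis.
X = [[0, 1],
     [1, 0]]
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

# CNOT in the {|00>, |01>, |10>, |11>} basis, second qubit as control
# (so |01> -> |11> and |11> -> |01>, as in the truth table above).
CNOT = [[1, 0, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0],
        [0, 1, 0, 0]]

assert matvec(X, [1, 0]) == [0, 1]                   # X|0> = |1>
assert matvec(CNOT, [0, 1, 0, 0]) == [0, 0, 0, 1]    # CNOT|01> = |11>

# H on the second qubit takes |00> to (|00> + |01>)/sqrt(2)...
psi = [1 / math.sqrt(2), 1 / math.sqrt(2), 0, 0]
# ...and CNOT turns that into the Bell state (|00> + |11>)/sqrt(2).
bell = matvec(CNOT, psi)
assert math.isclose(bell[0], bell[3]) and bell[1] == bell[2] == 0
```

This also illustrates why H alone cannot entangle anything: acting on one qubit it only creates superposition, and it takes a two-qubit gate like CNOT to tie the qubits together.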

## Quantum computers

Why do we want quantum computers? Because they can solve *hard problems* (but not necessarily NP-hard problems).

There are two main paths of research: one concerns the development of algorithms, while the other is about the construction of hardware able to run such algorithms. Both areas are progressing very fast, with innovations (and some breakthroughs) happening at an exponential pace.

First, let’s talk about the hardware. There are two classes of devices: one is based on the quantum equivalent of annealing, while the other deploys different phenomena to manipulate the state of qubits.

Quantum annealing is relatively easy to understand. An array of atoms can be set in a certain state and then left to evolve towards the energy minimum. Imagine a landscape of valleys and peaks: the goal is to find the deepest valley. What is unique to quantum systems is tunneling, *i.e.* the system can settle into a minimum without having to overcome the energy barriers (the peaks in our metaphor). This method is especially powerful for optimization problems, which are ubiquitous. The catch is that only problems that can be formulated in terms of a certain function, the Hamiltonian of the Ising model, can currently be accommodated on machines of this kind. In practice, the procedure is repeated a large number of times, and the minimum of all the minima found is taken as the best guess of the true global minimum, with no guarantee that it will always be found, provided there is one in the first place. Some people feel this is a bit of a cheat, as machines like this are not universal quantum computers and do not allow the implementation of quantum algorithms. The CEO of D-Wave Systems, the company that first introduced this method, thinks otherwise.
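The classical analogue of this procedure is easy to sketch. The toy code below runs simulated annealing (thermal hops instead of quantum tunneling) on a tiny Ising Hamiltonian E(s) = -Σ J_ij s_i s_j with made-up couplings, repeating the run several times and keeping the best minimum found, just as described above:

```python
import math
import random

random.seed(2)

# Made-up couplings for a 3-spin Ising model: E(s) = -sum J_ij * s_i * s_j
J = {(0, 1): 1.0, (1, 2): -0.5, (0, 2): 0.75}
N = 3

def energy(s):
    return -sum(j * s[a] * s[b] for (a, b), j in J.items())

def anneal(steps=2000, t0=2.0):
    """One annealing run: random start, slowly decreasing temperature."""
    s = [random.choice([-1, 1]) for _ in range(N)]
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9   # cooling schedule
        i = random.randrange(N)
        old = energy(s)
        s[i] *= -1                        # propose a spin flip
        if energy(s) > old and random.random() > math.exp((old - energy(s)) / t):
            s[i] *= -1                    # reject most uphill moves
    return s

# Repeat many times and keep the lowest minimum found; there is no
# guarantee in general that it is the true global one.
best = min((anneal() for _ in range(20)), key=energy)
print(best, energy(best))                 # ground state here has E = -1.25
```

On three spins the landscape is trivial, but the recipe (cast the problem as an Ising energy, anneal, repeat, keep the best) is the same one a quantum annealer follows, with tunneling giving it an extra way out of shallow valleys.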

The other big class of quantum computers is based on Noisy Intermediate-Scale Quantum (NISQ) technology. These machines are still not “fully quantum”, but already provide the user with a few tens of qubits (usually up to about 50, with the recent introduction of a 128-qubit device by Rigetti at the time of writing). Unfortunately, the noise due to thermal excitation limits the number of circuits that can be executed. One thing is certain: a 100-qubit computer is not going to change the world overnight. As for the qubits themselves, they are implemented in different ways, such as:

- Superconducting loops
- Trapped ions
- Silicon quantum dots
- Topological qubits
- Diamond vacancies

Different companies have decided to use different technologies depending on feasibility, reliability, noise tolerance, cost and other aspects. For example, IBM and Rigetti use superconducting loops, Intel deploys quantum dots, and Microsoft decided to make use of topological qubits.

> **Takeaway message #1**: Quantum computers, thanks to unique features of QM, can solve problems that are either *not possible* or *practically unfeasible* on classical systems.

The first potentially shocking application found was Shor’s algorithm, which decomposes an integer number into its factors. Most common cryptographic algorithms are based on the *practical* impossibility of factorizing a number with about 2,000 binary digits: it would take longer than the age of the universe to try out all possible combinations until a solution is found. Quantum systems, on the other hand, behave differently and offer routes to the answer that are not available to classical algorithms, such as the one used to perform this trick in Shor’s algorithm. The complexity of any algorithm can be expressed in terms of the number of operations needed to carry out the calculation. For example, scanning an array to find the maximum value requires *N* operations, where *N* is the number of elements. The quantum magic performed by the so-called Grover’s algorithm lowers the number of operations needed to √*N*, a huge (quadratic) speedup when *N* is very large. For other problems, the number of operations grows exponentially with *N* on a classical computer but only polynomially on a quantum one: this feature is called *exponential speedup*. To learn more about the quantum algorithms invented so far, take a look at the Quantum Algorithm Zoo.
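To get a feeling for what the √*N* scaling means in practice, here is a quick back-of-the-envelope comparison (operation counts only, not a simulation of either algorithm):

```python
import math

# Searching an unsorted list of N items: a classical scan needs ~N
# operations, while Grover's algorithm needs ~sqrt(N) oracle calls.
for n in (10**6, 10**9, 10**12):
    grover = math.isqrt(n)   # integer square root of n
    print(f"N = {n:.0e}: classical ~{n:.0e} ops, Grover ~{grover:.0e} ops")
```

At a trillion items the gap is a factor of a million, and it keeps widening as *N* grows, which is the whole point of the speedup.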

> **Takeaway message #2**: Quantum computers can also solve problems *much faster*, thanks to the exponential speedup.

## Quantum computing at CERN

Often, it’s *big challenges* that push technology to the edge. The most famous example is certainly the quest to land a human being on the Moon and bring him back safely to Earth; it took less than a decade of effort to make it happen. What is happening now is that Google is challenging big institutions (NASA and CERN fit the bill) to demonstrate what they refer to as *quantum supremacy*: they want to show that quantum systems can outperform classical supercomputers, at least in some well-defined areas. Not everyone agrees, of course, the main objection being that quantum devices are (currently) very susceptible to errors due to thermal fluctuations compared to classical machines, an issue that might never be completely overcome.

In November 2018, CERN Openlab hosted a workshop to discuss the possibility of deploying quantum computing technology in high-energy physics. This event offered people working in academia and in the private sector an opportunity to get together and go through the details of what the future will look like – where “future” usually means about 10 years from now, which is roughly the timescale of the High-Luminosity LHC upgrade (HL-LHC). Needless to say, major breakthroughs can hardly be predicted and may in fact happen at any time. At the moment, the main applications seem to be optimization problems, training quantum neural networks (J.-R. Vlimant, W. Guan), track-finding and detector emulation (H. Gray, T. Boccali), and QFT calculations (A. Macridin, S. Jordan).

## Conclusions

We are witnessing the advent of quantum computers. While their impact is nearly non-existent at present, there is no reason to ignore this technology in the long run. Any organization with ongoing projects on a timescale of a decade or more should start paying attention before it’s too late, and avoid making the same errors of the past.

> “I think there can be a world market for maybe five computers” – Thomas Watson, CEO of IBM, 1943
>
> “There is no reason for an individual to have a computer at home.” – Ken Olsen, founder of DEC, 1977
>
> “I think that this thing that Tim Berners-Lee has shown me has no future” – Federico Carminati, Computing project leader at CERN, 1989

My take is that one day, not too far in the future, we’ll routinely make use of hybrid systems, where most of the workload is still done by ordinary CPUs, but dedicated tasks are carried out by either low-power massively parallel systems (GPUs) or quantum processors (QPUs). Stay tuned: one day you’ll be able to buy a quantum device on Amazon!