**IBM has recently announced its plans to begin commercial sales of quantum computers in the next three to five years. As the company’s VP Norishige Morimoto explained, this will happen once the US tech giant’s quantum computers surpass the computational capacities of current-day supercomputers. How realistic are these plans? What challenges do researchers still face in the development of quantum computers? Scientists from Russia and abroad discussed the latest trends in quantum technology during ITMO University’s seminar Quantum Machine Learning. Published below are the key highlights of a lecture by Aleksandr Kirienko, who came to ITMO as part of the ITMO Fellowship program.**

**Why do we need quantum computers?**

Quantum computing has existed as a field for quite some time, but only in recent years have people begun to understand the nature of quantum computers and their potential. To explain what makes a computer “quantum”, let’s start with a regular computer, which performs operations on bits (zeros and ones) stored as bitstreams. Once we attach logical operations to a bitstream, we end up with a function that operates on bits.
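As a minimal illustration of “logical operations attached to bits forming a function” (my own example, not from the lecture), here is a classical full adder built purely from logic operations:

```python
# A classical computer: bits plus logic gates composed into a function
def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """Add three bits; return (sum_bit, carry_out) using only logic ops."""
    s = a ^ b ^ carry_in                        # XOR gives the sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))  # AND/OR give the carry
    return s, carry_out

print(full_adder(1, 1, 0))  # (0, 1): 1 + 1 = binary 10
```

Chaining such adders over a bitstream yields multi-bit arithmetic, which is exactly the sense in which a classical computer is a function on bits.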

In computer science, we have the concept of a “universal Turing machine”: a single device that can carry out any algorithm expressible as a finite sequence of operations. This means that, in principle, the same computation can be performed successfully on very different kinds of hardware.

If we were to increase the number of operations, we might start with something quite simple, akin to the first laptops, and then move on to something more complex, like the supercomputers that exist today and take up entire rooms. It would seem enough to simply keep increasing the number of transistors on a chip. Moore’s Law, which predicted that the number of transistors per circuit would continue to double roughly every two years, has held true so far and served as evidence that computers can develop very rapidly.

But at some point, people began to wonder not just about the size of a system, but about performance, which depends on our ability to effectively dissipate heat. And it turned out that the number of elements per circuit does matter: we can’t pack too many of them together.

So what happens once classical hardware can develop no further? One option is to invest in quantum computers, which operate on completely different principles.

**What’s the challenge?**

If we wanted to simulate a quantum system, we wouldn’t be able to do it the regular way. Back in the day, **Richard Feynman** noted that modeling even the simplest physical systems on a computer requires an incredible amount of computational resources. What he suggested was using the principles of quantum mechanics themselves to accelerate calculations. It seemed like a crazy idea at the time, but once it was eventually formalized, we learned that quantum computers have a completely different kind of capacity than regular computers.

As for how quantum mechanics actually works: the state of a system is described by a wave function, a massive vector acted on by an evolution operator. That operator is the exponential of the Hamiltonian matrix (in quantum theory, the Hamiltonian H is the operator of a system’s total energy). To multiply these matrices by a classical vector, we would need an exponentially growing number of elements as the system gets bigger. If we were to do it quantum-mechanically, though, the process would occur naturally. It should also be noted that, unlike classical bits, quantum states cannot be copied (the no-cloning theorem), a consequence of the fact that the evolution operator is unitary.
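The cost described above can be seen directly in a tiny classical simulation. Below is an illustrative sketch (my own toy Hamiltonian, not one from the lecture) of building the evolution operator exp(-iHt) for a single qubit; note that for n qubits the matrices would be 2^n by 2^n, which is exactly where the exponential blowup comes from:

```python
import numpy as np

# Pauli matrices; we set hbar = 1 for simplicity
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# A toy single-qubit Hamiltonian (illustrative coefficients)
H = Z + 0.5 * X

# For a Hermitian H, the evolution operator U = exp(-iHt) can be built
# from its eigendecomposition: U = V exp(-i * diag(E) * t) V^dagger
t = 1.0
energies, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * energies * t)) @ V.conj().T

# U is unitary (U^dagger U = I), so evolution preserves the state's norm
assert np.allclose(U.conj().T @ U, np.eye(2))

psi = U @ np.array([1, 0], dtype=complex)  # evolve the state |0>
print(np.linalg.norm(psi))  # ≈ 1.0
```

The unitarity check at the end is the same linearity property that forbids copying an unknown quantum state.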

In 1994, **Peter Shor** showed that a quantum algorithm can find prime factors exponentially faster than the best known classical algorithms. It would seem that this one algorithm alone is proof enough of the need to create quantum computers. Nevertheless, before creating more such algorithms we need to overcome a few other obstacles.

First off, quantum data doesn’t live forever. It is highly sensitive to how the system interacts with its environment. Any interaction that introduces noise into a quantum system destroys the data, and therefore must be prevented. When we measure the system, the wave function collapses and the calculation must be restarted from scratch. In a sense, a quantum computer works like a probabilistic machine. As I’ve already noted, we also can’t copy data, which makes it more difficult to avoid errors.
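The “probabilistic machine” behavior can be sketched in a few lines (an illustrative example of my own, assuming the standard Born rule): measurement outcomes are random, with probabilities given by the squared amplitudes, and each run yields only one outcome.

```python
import numpy as np

rng = np.random.default_rng(0)

# An equal superposition of |0> and |1>
psi = np.array([1.0, 1.0j]) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes
probs = np.abs(psi) ** 2
assert np.isclose(probs.sum(), 1.0)

# Each "run" yields one random outcome; the state then collapses
# and must be re-prepared before the next run
samples = rng.choice([0, 1], size=10_000, p=probs)
print(samples.mean())  # ≈ 0.5 for this state
```

This is why quantum algorithms are typically run many times and why a single measurement extracts so little of the information stored in the wave function.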

From a theoretical standpoint, many of these questions were already solved in the ‘90s, so why haven’t we yet seen a proper quantum computer? The answer is that, despite the existence of Shor’s algorithm, running it requires a lot of resources. Ones we simply don’t have today.

All the existing algorithms of this kind promise massive speedups. This begs the question: can such acceleration be achieved on the computers that exist today, or on those that will be created in the future? Many scientists, including **Matthias Troyer**, who now works at Microsoft, have tried to find the answer. He examined each algorithm by asking: at what point in the future (in, say, ten years) could we actually run it?

His research showed that this would be a non-trivial task even for the quantum computers of the future: even a massive quantum computer – made up of 10^{8} qubits – would require 10^{29} operations. Even if each elementary operation lasted only dozens of nanoseconds, we’d end up with years of calculations.
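A back-of-the-envelope check of the figures quoted above (my own arithmetic, assuming a 10 ns gate time and perfect parallelism across all qubits – both generous assumptions):

```python
# Resource estimate from the figures above
qubits = 1e8
operations = 1e29
gate_time_s = 10e-9  # "dozens of nanoseconds", taking the low end

# Sequential steps if the work parallelizes perfectly over all qubits
steps = operations / qubits
runtime_s = steps * gate_time_s
runtime_years = runtime_s / (365 * 24 * 3600)
print(f"{runtime_years:.0e} years")  # on the order of 1e5 years
```

Even under these idealized assumptions the runtime comes out to many years, which is the point of the estimate.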

**Quantum chemistry: what is it and what tasks does it solve?**

And thus we face another question: is there anything exciting we can do right now? According to scientists, there is – if we take a system that’s already close enough to a quantum one, such as a molecule. By treating its Hamiltonian as the interaction of nuclei and electrons, we can present it as a many-body quantum mechanics problem. That is a complex model, but it can nevertheless be written down as an interaction between qubits. In the future, we may try to simulate that Hamiltonian on a quantum computer.
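To make “written down as an interaction between qubits” concrete, here is an illustrative sketch (toy coefficients of my own choosing, not a real molecule) of a two-qubit Hamiltonian expressed in Pauli operators and solved exactly – the classical route that stops working once the qubit count grows, since the matrix dimension doubles with every added qubit:

```python
import numpy as np

# Pauli matrices and the identity
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron(*ops):
    """Tensor product of single-qubit operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# A toy two-qubit Hamiltonian of the form molecular problems reduce to:
# a ZZ coupling plus transverse X terms (illustrative coefficients)
H = -1.0 * kron(Z, Z) + 0.5 * (kron(X, I2) + kron(I2, X))

# Exact diagonalization works while the system is small (4x4 here)
energies = np.linalg.eigvalsh(H)
print(energies[0])  # ground-state energy, ≈ -1.414 (= -sqrt(2))
```

On a quantum computer the same Hamiltonian would be attacked with algorithms such as variational eigensolvers rather than by building the full matrix.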

Why do we need this? It would let us, for example, understand the processes that occur inside molecules. Take the Haber-Bosch process, which we use to fix atmospheric nitrogen into ammonia at high temperatures and pressures. There is also a biological process that does the same at room temperature and in normal conditions, but we’ve yet to learn how to harness it because the molecules involved are far too complex. If we could take a quantum computer and apply even the algorithms that exist today, we could compute the ground and excited states of the quantum system, predict reaction rates, and suggest new catalysts. This way we’d reduce the use of natural gas by two to five percent, which translates into an enormous amount of money.

**Quantum devices**

This is the field that deserves credit for allowing us to speak of quantum computers as something real. Even ten years ago, a talk like this would only attract academic interest; after all, quantum computers were just theory. Nowadays, it’s not just a theory or a field of study, but a full-fledged industry. Starting in 2014, some of the world’s top companies have presented quantum computer prototypes: Google’s device contains 72 qubits, while Intel’s and IBM’s contain 49 and 50, respectively. When I first joined this field, quantum computers were made up of only 2, or maybe 4-5, qubits. What we’re seeing now are truly massive systems.

It must be said that the size of a system is not its only criterion. Lots of teams can print a circuit, but you must also make it all work: no company has so far presented an experiment that entangles all of its qubits. Nevertheless, we’re seeing progress, and not only in industry but in academia as well. Teams from Harvard University have showcased a 51-qubit simulation.
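Entangling qubits is easy to describe mathematically; the hard part is doing it on real hardware. As a sketch of what the math asks of the device (a minimal state-vector example of my own, not from the lecture), here is the textbook Hadamard-plus-CNOT circuit that entangles two qubits into a Bell state:

```python
import numpy as np

# Hadamard gate and CNOT gate as matrices
H_gate = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, apply H to the first qubit, then CNOT
psi = np.zeros(4, dtype=complex)
psi[0] = 1.0
psi = CNOT @ np.kron(H_gate, np.eye(2)) @ psi

# Result: (|00> + |11>) / sqrt(2) -- the two qubits are now entangled
print(psi.round(3))
```

Scaling this from two qubits to fifty, while keeping the noise low enough for the entanglement to survive, is precisely the engineering challenge the paragraph above describes.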

In addition to the giants, there are also plenty of startups, whose number has grown significantly in recent years. One such startup is Rigetti Computing, founded four years ago and now employing some 100 staff; it promises to build a 128-qubit device before the end of 2019. There are many other, smaller companies with great ambitions. The ecosystem is growing rapidly, and anyone who wants to create a quantum computer and remain competitive has to do it now.

Today’s modest systems can be used to solve relatively simple quantum chemistry tasks and improve optimization; in the future, we’ll try to achieve the same with larger systems. There is a lot of ground to cover between our dream and what we have right now. This is what **John Preskill** in 2018 dubbed the era of NISQ (Noisy Intermediate-Scale Quantum) devices: systems large enough to show off some non-trivial physics, but still limited in application. This is where most research is being done right now.

**Quantum programming**

I do not believe this subject has gotten the attention it deserves, but with time it might become a key topic. We must keep in mind that quantum software is not any single piece of software, but an entire ecosystem made to service a quantum computer. For instance, we need to have a control system, as well as various programming languages depending on the task at hand.

In classical programming, we have high-level languages like C++ and C#; in quantum computing, new languages are emerging, such as Microsoft’s Q#. The companies that currently dominate the computer market can be expected to soon dominate the quantum computing market as well.

**Quantum Machine Learning seminar: key objectives**

The interdisciplinary seminar Quantum Machine Learning brought together experts and young researchers from ITMO University and other universities around the world. As the seminar’s organizers noted, its main goal was to provide a platform for the discussion of key tasks, methods and approaches to the new field at the intersection of quantum technologies, machine learning, and AI.

“Quantum computing is like football: everyone knows something about it and has something to say, but there’s a lack of deeper understanding. Firstly because it’s too complex a subject, and secondly because new knowledge is being added constantly. For these reasons, it is important that we explore the current landscape and understand the stage at which quantum computing research is right now. We also invited some of the specialists who work on this subject at ITMO University. It allows us to learn more about the research being done at the University and, perhaps, consolidate our research,” said Ivan Iorsh, the head of ITMO University’s International Laboratory of Light-Matter Coupling in Nanostructures.

