Quantum computers promise to deliver enormous computational power and solve problems that cannot be tackled by ordinary (classical) machines. There are many hardware platforms on which quantum computing can be developed, and it is not yet clear which technology, or combination of technologies, will prove most successful. Today, the leading schemes are based on superconducting electrical circuits or trapped-ion technologies. Another approach, based on photonics, has often been considered impractical because of difficulties in generating the required quantum states, or transformations of such states, on demand. However, this method could turn out to be the dark horse of quantum computing. Writing in *Nature*, Arrazola *et al.*^{1} report the development of a programmable and scalable photonic circuit, and demonstrate three types of quantum algorithm on this platform.

According to quantum theory, there is an inevitable uncertainty associated with the amplitude and phase of any state of light (the phase specifies the stage of the oscillation cycle that the light wave is in). If this quantum uncertainty is unequally distributed between the amplitude and phase, the state is said to be squeezed, and the more the state is squeezed, the more photons it contains. Multi-photon squeezed light is found in many quantum-optics experiments, and quantum-computing models based on these states have existed for more than two decades^{2}^{,}^{3}. However, whether computers based on such models would be practical has been justifiably questioned, because of this quantum uncertainty.
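This trade-off can be made quantitative. For an ideal squeezed vacuum state with squeezing parameter r — a standard textbook result, not a detail from the paper — the uncertainty in one quadrature shrinks as e^{−r} while the other grows as e^{r}, and the mean photon number grows as sinh²(r), which is why stronger squeezing means more photons:

```python
import numpy as np

# Squeezed vacuum with squeezing parameter r, in natural units where
# the ordinary (vacuum) uncertainty in each quadrature equals 1.
r = np.linspace(0.0, 2.0, 5)
delta_amplitude = np.exp(-r)      # squeezed below the vacuum level
delta_phase = np.exp(r)           # anti-squeezed above it
mean_photons = np.sinh(r) ** 2    # photon content grows with squeezing

# The uncertainty product stays fixed at the quantum limit: the noise
# is redistributed between amplitude and phase, never eliminated.
assert np.allclose(delta_amplitude * delta_phase, 1.0)
```

With no squeezing (r = 0) the state is the vacuum and contains no photons; as r grows, the state becomes both more squeezed and brighter.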

This scepticism has disappeared in the past few years. It became clear that a relatively simple optical circuit, based solely on squeezed light, beam splitters (devices that split beams of light in two) and photon counters, could carry out a sampling algorithm (a procedure that takes a random sample of data) at a speed beyond the reach of classical computers^{4}. It was also discovered that such an algorithm has many practical applications^{5}. For example, it is useful in simulating transitions between states of molecules^{6} and finding matching configurations of two molecules — a process known as molecular docking^{7}.

In the computing architecture used to implement this quantum sampling algorithm, squeezed states of light are generated and launched into an optical network consisting of several optical paths and beam splitters (Fig. 1). The squeezed states mix together when they meet in beam splitters because of a quantum effect called interference. As a result, all the states come out completely scrambled, in a way that depends on the relative lengths of the optical paths, known as their relative phases. Reprogramming these phases alters the type of scrambling. After scrambling, the number of photons in each output of this quantum circuit is counted using highly sensitive detectors.
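The mode transformation carried out by such a network can be sketched as a product of two-mode beam-splitter unitaries with programmable phases. The toy four-mode layout below (illustrative only; the mode pairings and phase values are invented, not taken from the paper) shows that the overall transformation is unitary, and that reprogramming a single phase changes the scrambling:

```python
import numpy as np

def beam_splitter(n_modes, i, j, theta, phi):
    """Unitary of one beam splitter acting on modes i and j of an
    n-mode network; phi is the programmable relative phase."""
    u = np.eye(n_modes, dtype=complex)
    u[i, i] = u[j, j] = np.cos(theta)
    u[i, j] = -np.exp(-1j * phi) * np.sin(theta)
    u[j, i] = np.exp(1j * phi) * np.sin(theta)
    return u

def network(phases, pairs, n_modes=4):
    """Compose beam splitters into the overall mode transformation."""
    U = np.eye(n_modes, dtype=complex)
    for (i, j), phi in zip(pairs, phases):
        U = beam_splitter(n_modes, i, j, np.pi / 4, phi) @ U
    return U

pairs = [(0, 1), (2, 3), (1, 2), (0, 1)]   # hypothetical mesh layout
U = network([0.3, 1.1, 2.0, 0.7], pairs)

# The overall transformation is unitary (photon number is conserved) ...
assert np.allclose(U @ U.conj().T, np.eye(4))
# ... and reprogramming one phase alters how the modes scramble.
assert not np.allclose(U, network([0.3, 1.1, 2.0, 1.5], pairs))
```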

The measurement outcome provides a specific sample of data from the quantum experiment. For a classical computer, the time needed to take such samples scales exponentially with the number of input squeezed states (amounting to billions of years when this number is high). By contrast, the quantum circuit can produce a sample in a fraction of a second, demonstrating what is called a quantum advantage.
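The classical hardness has a concrete source: in this scheme, known as Gaussian boson sampling, the probability of each photon-count pattern is proportional to the squared modulus of a matrix function called the hafnian, which sums over all perfect matchings of a matrix associated with the circuit and is believed to be classically intractable. A minimal brute-force sketch (illustrative Python, not from the paper) shows the combinatorial blow-up:

```python
import numpy as np

def hafnian(A):
    """Brute-force hafnian: sum over all perfect matchings of the
    (even-sized) symmetric matrix A. The number of matchings is
    (n - 1)!! = (n - 1)(n - 3)...1, so the cost grows factorially."""
    n = len(A)
    if n == 0:
        return 1.0
    total = 0.0
    for j in range(1, n):           # pair row 0 with each partner j
        rest = [k for k in range(1, n) if k != j]
        total += A[0][j] * hafnian(A[np.ix_(rest, rest)])
    return total

# Sanity check: for the all-ones matrix, the hafnian counts the
# perfect matchings of the complete graph on n vertices.
print(hafnian(np.ones((6, 6))))     # K6 has 5 * 3 * 1 = 15 matchings
```

Each added pair of modes multiplies the number of terms, which is the sampling analogue of the exponential scaling described above.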

Arrazola *et al.* implemented their photonic circuit on a silicon nitride chip that is compatible with the fabrication processes used by the semiconductor industry. The authors produced a squeezed state in each of four micrometre-sized on-chip devices known as optical ring resonators, using an effect called four-wave mixing. They achieved light propagation and interference by carefully etching tiny structures known as optical waveguides on the chip. The network of beam splitters was fully controllable and could be reprogrammed by a remote user through the cloud. The output of the network was then directed to four photon-counting detectors, which generated the samples that were sent to the remote user.

The authors executed different types of measurement to characterize the quality of the squeezed-light sources and the overall performance of the chip. First, they measured the uncertainty suppression of the squeezed states relative to ordinary states to be about 84%. Second, they measured the temporal purity of the states (a property that is crucial for successful interference in the network) to be 85%. Third, they carefully tested the quality of the interference. And finally, they verified that the samples generated had a genuinely quantum nature by testing them against a criterion for non-classicality — a necessary condition if the device, when scaled up, is to produce samples that are impossible to simulate using a classical computer.

In addition to the sampling algorithm used to demonstrate a quantum advantage, Arrazola and colleagues implemented two algorithms of greater practical relevance: one that determines energy spectra for transitions between molecular states, and another that finds the similarity between mathematical graphs that represent different molecules. The authors achieved this feat by encoding the specific problem into the squeezed states and beam-splitter network, and then using the samples generated to estimate the molecular spectra or classify the graphs.
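For the graph-similarity task, one common approach in related Gaussian-boson-sampling work is to coarse-grain the photon-count samples into "orbits" — count patterns taken regardless of which mode each photon arrived in — and to use the orbit frequencies as a feature vector for each encoded graph. The sketch below is illustrative only; the sample data is invented, not taken from the paper:

```python
import numpy as np
from collections import Counter

def feature_vector(samples, orbits):
    """Orbit frequencies of photon-count samples as a feature vector.
    An orbit is a count pattern sorted in decreasing order, so mode
    ordering is ignored."""
    counts = Counter(tuple(sorted(s, reverse=True)) for s in samples)
    return np.array([counts[o] / len(samples) for o in orbits])

# Hypothetical samples from two encoded graphs (photons per mode).
samples_g1 = [[1, 1, 0, 0], [2, 0, 0, 0], [1, 1, 0, 0], [1, 0, 1, 0]]
samples_g2 = [[2, 0, 0, 0], [2, 0, 0, 0], [1, 1, 0, 0], [0, 2, 0, 0]]

orbits = [(1, 1, 0, 0), (2, 0, 0, 0)]
v1 = feature_vector(samples_g1, orbits)
v2 = feature_vector(samples_g2, orbits)

# Cosine similarity between feature vectors is used to classify graphs.
similarity = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
```

In a real experiment many thousands of samples would be drawn, and the resulting feature vectors fed to a standard classifier.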

Quantum sampling based on squeezed states has been demonstrated by other research groups^{8}^{–}^{10}. In particular, one group last year ran the sampling algorithm on 50 squeezed states in 100 optical paths, and reported a quantum advantage^{10}. The researchers estimated that it would take 600 million years to simulate such an experiment on a supercomputer. However, these demonstrations were not scalable because of the bulkiness of the set-up^{8}^{,}^{10} or owing to photon losses^{9}. Moreover, the circuitry of these previous experiments was not reconfigurable, so only a single, fixed instance of the sampling algorithm could be executed. In stark contrast, Arrazola and colleagues’ circuitry is programmable and potentially highly scalable.

Nevertheless, there are still some hurdles to overcome before the quantum-sampling algorithm can reach its full potential and become useful for real-world applications. For instance, the quality of the squeezed states must be markedly improved, and for some applications, the degree of squeezing and the amount of optical power in each squeezed state must be individually controlled. Moreover, to scale up the system, photon losses need to be decreased; otherwise, the photons will not survive their journey through the circuitry.

Without doubt, the authors’ demonstration of quantum sampling on a programmable photonic chip using highly squeezed states is remarkable and represents a milestone in this field. However, the number of commercial applications that can be implemented using the current architecture is limited. Completely different platforms are required to run heftier algorithms, such as Shor’s algorithm for factoring large numbers into prime numbers^{11}, in an error-free manner. Fortunately, such platforms (also based on squeezed states) have been proposed^{12}^{,}^{13}, and their implementation constitutes the next step towards constructing a full-blown optical quantum computer.
