Quantum computing (QC) leverages quantum mechanics to enable a vastly different mode of computation than computers based on classical physics, including conventional von Neumann systems. A quantum bit (qubit), like a classical bit, takes a binary 0 or 1 value when measured, usually at the end of a quantum computation. However, the value of a qubit is not deterministic. A quantum state of n interacting qubits is parameterized by 2^n complex numbers, which are called amplitudes and cannot be accessed directly; measuring such a state produces a single random n-bit classical string with probability dictated by a corresponding amplitude.
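The measurement rule described above can be illustrated with a minimal classical simulation. This is a sketch, not a quantum-computing library: the `measure` function and the uniform-superposition example are illustrative names chosen here, and the Born rule (outcome probability equals the squared magnitude of the amplitude) is the only physics assumed.

```python
import numpy as np

def measure(amplitudes, rng=np.random.default_rng(0)):
    """Simulate measuring an n-qubit state (illustrative sketch).

    `amplitudes` is a length-2**n complex vector; by the Born rule,
    outcome k (read as an n-bit string) occurs with probability
    |amplitudes[k]|**2.
    """
    probs = np.abs(amplitudes) ** 2
    n = int(np.log2(len(amplitudes)))
    k = rng.choice(len(amplitudes), p=probs)
    return format(k, f"0{n}b")  # a single random n-bit classical string

# Uniform superposition over 3 qubits: 2**3 = 8 amplitudes, each 1/sqrt(8),
# so every 3-bit string is equally likely when measured.
state = np.full(8, 1 / np.sqrt(8), dtype=complex)
print(measure(state))
```

Note that this classical simulation stores all 2^n amplitudes explicitly, which is exactly what becomes infeasible as n grows — the point of the paragraphs that follow.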
A powerful feature of quantum computation is that manipulating n qubits allows users to sample from an exponentially larger probability distribution over 2^n outcomes. However, an analogous claim can be made for randomized classical algorithms operating on n probabilistic bits (e.g., flipping n coins). A key difference between the two is that quantum algorithms seem to be able to sample from certain kinds of probability distributions that may take exponentially longer for randomized classical algorithms to mimic.
For example, Shor’s seminal 25-year-old quantum algorithm for factoring integers requires exponentially fewer steps than the best-known classical counterparts. Exponential quantum advantages are also known for other fundamental scientific problems, such as solving certain kinds of linear systems of equations and simulating quantum-mechanical systems, currently a critical bottleneck in many physical and chemical applications. The precise source of quantum computational advantage is not well understood; however, it is attributed in part to quantum computation’s ability to efficiently generate entanglement among qubits, yielding probability distributions with correlations that in some cases lie beyond the reach of efficient classical algorithms.
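The correlations that entanglement produces can be glimpsed even in a tiny example. The sketch below samples from the two-qubit Bell state (|00⟩ + |11⟩)/√2: each bit on its own is a fair coin, yet the two bits always agree. (This particular distribution is easy to mimic classically; it illustrates the kind of correlation entanglement creates, not a classically hard sampling task. All variable names here are illustrative.)

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2): four amplitudes over the outcomes
# 00, 01, 10, 11. Only 00 and 11 have nonzero amplitude.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(bell) ** 2  # Born rule: [0.5, 0, 0, 0.5]

rng = np.random.default_rng(1)
samples = [format(k, "02b") for k in rng.choice(4, size=1000, p=probs)]

# Each bit alone is 50/50, but the two bits are perfectly correlated.
assert all(s in ("00", "11") for s in samples)
```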
Successes in designing theoretical quantum algorithms have fueled the hope that other quantum advantages can be discovered and exploited. Ideal quantum advantages would provide: (i) an exponential (or at least super-polynomial) computational speedup, (ii) practical applications, and (iii) implementation on a physically realizable quantum system (ideally scalable). A foremost open question in quantum computing is whether all three of these can be simultaneously achieved. A significant hurdle for (iii) is that prepared quantum states are fragile and highly susceptible to environmental noise and rapid entropic decay. Contemporary quantum information science (QIS) research addresses (i) and (ii) by developing novel quantum algorithms and applications and (iii) through scientific and engineering efforts to develop noise-resilient and scalable quantum infrastructure.
After decades of steady progress, mainly in academia, the past five years have seen an explosion of interest and effort in QIS. Fifteen years of QC research at Sandia span the Labs’ expertise from theoretical computer science and physics to microelectronic fabrication, laboratory demonstrations, and systems engineering. Hardware platforms developed at Sandia include a variety of efforts in trapped-ion, neutral-atom, and semiconductor spin qubits. Complementary theoretical efforts have created unique capabilities, from quantum characterization, verification, and validation protocols to multi-scale qubit device modeling tools. Even efforts that are ostensibly purely theoretical, such as quantum algorithms development, are tied to applications of interest ranging from optimization and machine learning to materials simulation. The breadth of current Sandia research activities, coupled with the longevity of Sandia’s program, has established Sandia as a leading U.S. National Laboratory in QC and broader QIS research.
Most recently, Sandia has secured a number of quantum computing projects funded by the recent push from the DOE Office of Science and the National Nuclear Security Administration. Among these projects, closest to the hardware, are the Advanced Scientific Computing Research (ASCR)-funded Quantum Scientific Open User Testbed (QSCOUT) and Quantum Performance Assessment (QPerformance) projects. In just over a year, the first edition of the QSCOUT testbed, with three trapped-ion qubits, was stood up. While this will be increased to thirty-two qubits in time, the testbed is most significant for giving researchers complete access to the generation of the control signals that specify how gates are operated, so they can further investigate the quantum computer itself. A critical component of this effort is the Sandia-developed Jaqal quantum assembly language, which will be used to specify programs executed on QSCOUT. The QPerformance project aims to create techniques for evaluating every aspect of a testbed QC’s performance and for understanding and tracking how these change as the QC hardware and software improve. The effort is not limited to the QSCOUT testbed: it will also invent and deploy platform-independent, holistic benchmarks that capture high-level characteristics predictive of a QC platform’s suitability for DOE mission-relevant applications.
At the next level of the computing hierarchy sits the ASCR-funded “Optimization, verification and engineered reliability of quantum computers” (OVER-QC) project. Led by Sandia, this project aims to develop tools that get the most out of near-term QC hardware, which will be noisy and imperfect. By developing specialized techniques to interpret the output of such noisy hardware and to increase its reliability, OVER-QC aims to understand and push the limits of near-term devices.
Sandia complements these near-term, hardware-driven efforts with ASCR-funded projects focused on developing fundamental, hardware-agnostic quantum algorithms for future fault-tolerant quantum computers. These Sandia-led projects, “Quantum Optimization and Learning and Simulation” (QOALAS) and “Fundamental Algorithmic Research for Quantum Computing” (FAR-QC), are multi-institutional, interdisciplinary efforts leveraging world-class computer science, physics, and applied mathematics expertise at Sandia and more than ten partner institutions. QOALAS seeks to develop novel quantum algorithms enabling new applications in optimization, machine learning, and quantum simulation. FAR-QC expands upon the scope of QOALAS to identify problems and domains in which quantum resources may offer significant advantages over classical counterparts. Achievements of these projects include new quantum algorithms offering significant advantages for solving linear systems, convex optimization, machine learning kernels, and rigorous simulation of physical systems.
Among the key mission priorities of Sandia are those related to stockpile stewardship. The Advanced Simulation and Computing (ASC)-funded Gate-Based Quantum Computing (GBQC) project is focused on understanding the prospects for QC platforms to eventually have significant impacts on the unique problems of stockpile stewardship. In this context, quantum simulation is a key capability.
Sandia’s stockpile stewardship mission requires models of the behavior of materials under extreme conditions, behavior that is both challenging and expensive to evaluate experimentally. GBQC is focused on understanding what will be required to realize a simulation capability that would be exceptionally impactful to ASC and the broader DOE. Recent research directions have broadened the scope of this work to understand the impacts that QCs might have on numerical linear algebra, a key capability not only for ASC applications but for most of computational science.
Sandia has spent fifteen years developing a strong program in QIS and QC to better serve DOE and NNSA customers. As a result, Sandia is poised to be a leader in the fields of QIS and QC research, while integrating capabilities across the whole QC stack.