Multiply 1,048,589 by 1,048,601, and you’ll get 1,099,551,473,989. Does this blow your mind? It should, maybe! That 13-digit number, the product of two prime numbers, is the largest number ever factored by a quantum computer, one of a series of quantum computing-related breakthroughs (or at least claimed breakthroughs) achieved over the last few months of the decade.
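The arithmetic is easy to check for yourself. A quick sketch in Python (the `is_prime` helper here is a plain trial-division check written for this post, not any particular library's function):

```python
def is_prime(n: int) -> bool:
    """Trial division; plenty fast for seven-digit numbers."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

p, q = 1_048_589, 1_048_601
product = p * q

print(product)                    # 1099551473989
print(len(str(product)))          # 13 digits
print(is_prime(p), is_prime(q))   # both factors are prime
```

The hard part, of course, is going the other direction: given only 1,099,551,473,989, recovering the two factors is exactly the kind of problem quantum computers are supposed to be good at.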
An IBM computer factored this very large number about two months after Google announced that it had achieved “quantum supremacy,” a clunky term for the claim, disputed by rivals including IBM, that Google has a quantum machine that performed some math normal computers simply cannot do in any practical amount of time.
Quantum computing remains an arcane field, still existing mostly in the theoretical, but quantum computers have done enough recently, and are commanding enough very real public and private resources, to deserve your attention. Not least because if and when the Chinese government becomes master of all your personal data, sometime in the next decade, it will be because a quantum computer cracked the encryption.
Building the quantum computer, it is said, breathlessly, is a race to be won, as important as being the first in space (though, ask the Soviet Union how that worked out) or fielding the first workable atomic weapon (seems to be going OK for the U.S.).
And so here is a post—written in terms as clear and simple as this human could muster—summing up these recent advances and repeating other experts’ predictions that the 2020s appear to be the decade when quantum computers begin to contribute to your life, by both making slight improvements to your map app, and powering artificial intelligence robust and savvy enough to be a real-life Skynet.
First, the requisite introduction to the concept. Normal computers, such as the device you are using to access and display this content, process information in binary. Everything is either a one or a zero, or a series of ones and zeroes. On, or off. But what if the zero were simultaneously also a one? (Please exit here for your requisite digression into quantum physics and mechanics.)
The idea that a value can be a zero, or a one, or “both” at the same time is the quantum principle of “superposition.” A quantum bit, or qubit, is a unit of information that can hold such a superposition. The ability to process qubits is what allows a quantum computer to perform functions a binary computer simply cannot, like computations involving 500-digit numbers. Doing so quickly and on demand might allow for highly efficient traffic flow. It could also render current encryption keys mere speedbumps for a computer able to break them in an instant.
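For intuition, a qubit can be described with nothing more exotic than two numbers. A minimal sketch in plain Python, assuming the standard textbook picture (two complex amplitudes whose squared magnitudes give measurement probabilities); the variable names are illustrative, not any quantum library's API:

```python
import math

# A single qubit carries two amplitudes: one for reading a 0, one for a 1.
# In an equal superposition, each amplitude is 1/sqrt(2).
amp0 = 1 / math.sqrt(2)
amp1 = 1 / math.sqrt(2)

# Measuring the qubit yields 0 or 1 with probability equal to the squared
# magnitude of the corresponding amplitude.
p0 = abs(amp0) ** 2  # ~0.5: half the time you read a zero...
p1 = abs(amp1) ** 2  # ~0.5: ...and half the time a one
assert math.isclose(p0 + p1, 1.0)  # probabilities always sum to 1

print(p0, p1)
```

Until it is measured, the qubit genuinely occupies both possibilities at once, which is what lets a quantum machine explore many candidate answers in parallel.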
Why hasn’t this been mastered already? What’s holding quantum computers back? Particles only maintain useful quantum states when isolated from their surroundings, which in practice means engineering things very, very small or very, very cold. What quantum computers do exist are thus resource-intensive. Google’s, for example, involves metals “cooled” (the verb is inadequate) to about 460 degrees below zero Fahrenheit, a fraction of a degree above absolute zero, where thermal noise quiets enough for particles to sustain quantum behavior.
And as Subhash Kak, a Regents Professor of electrical and computer engineering at Oklahoma State University and an expert in the field, recently wrote, the “power” of a quantum computer can be gauged by how many quantum “bits,” or qubits, it can process. The machines built by Google, Microsoft, Intel, IBM and possibly the Chinese all “have less than 100 qubits,” he wrote. (In Google’s case, the company claims to have created a “quantum state” of 53 qubits.)
“To achieve useful computational performance,” according to Kak, “you probably need machines with hundreds of thousands of qubits.” And the qubits existing machines do offer are notoriously unstable and prone to error. They need many of the hard-won fixes and advancements that saw binary computers morph from room-sized monstrosities spitting out punch cards to iPhones.
How fast will that happen—can it happen?
Skeptics, doubters, and haters might note that Google first pledged to achieve quantum supremacy (the point at which a quantum computer performs a task no binary computer can manage in any feasible amount of time) by the end of 2017. Its achievement, then, arrived almost two full years behind schedule, which suggests other quantum claims, like the pledge from IBM’s Dario Gil that quantum computers will be useful “for commercial and scientific advantage” sometime next year, may also be dismissed, or at least subjected to deserved skepticism.
And those of us who can think only in binary may also find confusion in the dispute between quantum rivals. The calculation performed by Google’s Sycamore quantum computer in 200 seconds, the company claimed, would take a normal binary supercomputer 10,000 years to solve. Not so, according to IBM, which asserted that the calculation could be done by a binary computer in two and a half days. Either way, as The New York Times wrote, “quantum supremacy” is still a very arcane experiment “that can’t necessarily be applied to other things.” Google’s breakthrough might be the last achievement for a while.
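The gap between the two claims is easier to feel with the numbers laid out. Against Sycamore's 200 seconds, Google's 10,000-year estimate implies a speedup of over a billion; IBM's two-and-a-half-day figure shrinks it to roughly a thousandfold (treating a year as 365 days, since both figures are loose estimates anyway):

```python
sycamore_seconds = 200

google_estimate = 10_000 * 365 * 24 * 3600  # "10,000 years," in seconds
ibm_estimate = 2.5 * 24 * 3600              # "two and a half days," in seconds

print(round(google_estimate / sycamore_seconds))  # ~1.58 billion times faster
print(round(ibm_estimate / sycamore_seconds))     # 1,080 times faster
```

A thousandfold speedup on one contrived benchmark is still remarkable, but it is a very different claim from "math normal computers simply cannot do."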
But everybody is trying—including the U.S. government, which is using your money to do it. Commercial spending on quantum computing research is estimated to reach hundreds of millions of dollars sometime in the next decade. A year ago, spooked and shamed by what appeared to be an unanswered flurry of quantum progress in China, Congress dedicated $1.2 billion to the “National Quantum Initiative Act,” money specifically intended to boost American-based quantum computing projects. According to Bloomberg, China may have already spent 10 times that.
If you walk away with nothing else, know that quantum computer spending is very real, even if the potential is theoretical.