Paul Benioff (born 1930) is a US physicist who in 1980 imagined the feats computing might achieve if it could harness quantum mechanics, where the word quantum refers to the tiniest amount of something needed to interact with something else; it's basically the world of atoms and sub-atomic particles. Benioff's imagination helped give rise to the phrase "quantum computing", a term that heralds how the storage and manipulation of information at the sub-atomic level would usher in computing feats far beyond those of "classical" computers.
Benioff was, coincidentally, thinking along the same lines as Russian mathematician Yuri Manin (born 1937), who was then outlining a similar, still-vague concept. Since then, many others have promoted the potential of computing grounded in the concept of "superposition", whereby matter can be in different states at the same time.
Quantum computing is built on manipulating the superposition of the qubit, the name of its computational unit. When in superposition, a qubit is in a combination of the "basis states" 0 and 1 at the same time, whereas a computational unit in classical computing can only be 0 or 1. This characteristic, on top of the ability of qubits to influence one another even when not physically connected (a property known as entanglement), is what proponents say gives quantum computers the theoretical ability to evaluate millions of possibilities in seconds, something far beyond the power of the transistors driving classical computers.
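As a rough illustration of the idea (not part of the original article, and a simulation on a classical machine rather than real quantum hardware), a single qubit's superposition can be modelled as two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. The function names here are illustrative, not from any particular library.

```python
import math
import random

# A qubit state is a pair of complex amplitudes (alpha, beta) for the basis
# states 0 and 1, with |alpha|^2 + |beta|^2 = 1. A classical bit, by contrast,
# is always exactly (1, 0) or (0, 1).
def equal_superposition():
    """An equal-weight superposition: 50% chance of measuring 0 or 1."""
    a = 1 / math.sqrt(2)
    return (complex(a), complex(a))

def measure(state, rng=random):
    """Collapse the state: return 0 with probability |alpha|^2, else 1."""
    alpha, _beta = state
    return 0 if rng.random() < abs(alpha) ** 2 else 1

state = equal_superposition()
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(state)] += 1
# Roughly half the measurements yield 0 and half yield 1.
```

Note that simulating n qubits this way needs 2^n amplitudes, which is exactly why classical machines struggle to mimic large quantum systems.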
In 2012, US physicist and academic John Preskill (born 1953) devised the term "quantum supremacy" to describe how quantum machines one day could make classical computers look archaic.
In October last year, a long-awaited world first arrived. NASA and Google claimed to have attained quantum supremacy when something not "terribly useful" was computed "in seconds what would have taken even the largest and most advanced supercomputers thousands of years". The pair were modest about the result, noting that their computation on a 53-qubit machine meant they were only able "to do one thing faster, not everything faster". Yet peers at IBM dismissed the claim as "grandiosity" anyway, saying one of IBM's supercomputers could have performed the same task in two-and-a-half days.
Nonetheless, most experts agreed the world had edged closer to the transformative technology. Hundreds of millions of dollars are pouring into research because advocates claim quantum computing promises simulations, searches, encryptions and optimisations that will lead to advancements in artificial intelligence, communications, encryption, finance, medicine, space exploration, even traffic flows.
No one questions that practical quantum computing could change the world. But the hurdles are formidable to accomplish a leap built on finicky qubits in superposition, entanglement and "error correction", the term for overcoming the "decoherence" caused by errant qubits, which cannot be identified as faulty while they are in superposition. There's no knowing when, or if, a concept reliant on mastering so many tricky variables will eventuate. While incremental advances will be common, the watershed breakthrough could prove elusive for a while yet.
To be clear, quantum computing is expected to work alongside classical computers, not replace them.
Quantum computers are large machines that require their qubits to be kept near absolute zero (about minus 273 degrees Celsius), so don't expect them in your smartphones or laptops. And rather than the large number of relatively simple calculations done by classical computers, quantum computers are suited only to a limited number of highly complex problems with many interacting variables. Quantum computing would come with drawbacks too. The most-flagged disadvantage is the warning that a quantum computer could quickly crack the encryption that protects classical computers. Another concern is that quantum computing's potential would add to global tensions if one superpower gained an edge. The same applies in the commercial world if one company were to dominate. Like artificial intelligence, quantum computing has had its "winters", when its challenges smothered the excitement and research dropped off.
That points to the biggest qualification about today's optimism: quantum computing might take a long time to get beyond its rudimentary present, in which quantum machines are no more powerful than classical supercomputers and can't do practical things. But if quantum computing becomes mainstream, a new technological era will have begun.
By Michael Collins, Investment Specialist
We believe that successful investing is about finding, and owning for the long term, companies that can generate excess returns on capital for years to come.