We’re kicking off our exploration of the next breakthroughs in science and technology with a news cycle favorite that’s swimming in misunderstanding: quantum computing.
To learn where we’re going, let’s start with where we are: The device you are using to read this encodes information in bits—aka binary digits—measured as 0s or 1s. A quantum computer uses qubits—aka quantum bits.
- Qubits can store information as 0 and 1 at the same time thanks to something called superposition.
- Qubits also exhibit a phenomenon called entanglement, in which qubits become linked so that the state of one cannot be described independently of the others.
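The two bullets above can be sketched in a few lines of plain Python. This is a toy state-vector simulation on a classical machine, not how real quantum hardware works, and the gate names (Hadamard, CNOT) are standard quantum-computing operations not mentioned in the article:

```python
import math

# A single qubit starts in |0>: a list of amplitudes for (|0>, |1>).
qubit = [1.0, 0.0]

# A Hadamard gate puts it in an equal superposition of 0 and 1.
h = 1 / math.sqrt(2)
qubit = [h * (qubit[0] + qubit[1]), h * (qubit[0] - qubit[1])]
# Each outcome now has probability |amplitude|^2 = 0.5.

# Two qubits: amplitudes for |00>, |01>, |10>, |11>.
pair = [1.0, 0.0, 0.0, 0.0]
# Hadamard on the first qubit...
pair = [h * (pair[0] + pair[2]), h * (pair[1] + pair[3]),
        h * (pair[0] - pair[2]), h * (pair[1] - pair[3])]
# ...then CNOT (flip the second qubit if the first is 1) entangles them:
pair = [pair[0], pair[1], pair[3], pair[2]]

print([round(a, 3) for a in pair])  # only |00> and |11> have weight
```

The final state has nonzero amplitude only on |00> and |11>: measuring one qubit instantly tells you the other's value, which is why entangled qubits can't be understood in isolation. Note the catch for classical simulation: tracking n qubits takes 2^n amplitudes, which is exactly why big quantum computations overwhelm supercomputers.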
Why it matters: Some problems are so big they stump the best supercomputers in the world. But a scalable, fault-tolerant quantum computer (one that can detect and correct its own errors, so long as the hardware's error rate stays below a certain threshold) that harnesses superposition and entanglement will be able to get the job done.
But building that is really hard and may still take decades. That’s because encoding qubits requires control at the atomic level. Today’s quantum computers have 50–100 qubits, but experts estimate we’ll need hundreds or thousands of times that many to outperform existing computer systems (“quantum advantage”).
Boil it down
The promise: Quantum computers bypass the limits of binary systems and, one day, they’ll make supercomputers look like abacuses.
The roadblocks: Quantum computers today have 50–100 qubits. We’ll need hundreds of thousands, but it’s hard to keep qubits stable and shield them from outside interference.
The projected timeline: Don’t listen to headlines. It’ll be years (or decades) before we have a scalable, fault-tolerant quantum computer.
The major players: Big Tech (Google, Microsoft, IBM, Intel), startups (Rigetti, IonQ, D-Wave), and governments (China, U.S., Canada, EU).