Are the fastest computers we’re using today the fastest computers we’ll ever use? Though still largely theoretical and abstract, quantum computing is gaining attention as a possible path to impossibly fast computers.
A distillation of Moore’s Law (named for Gordon Moore, co-founder of Intel and a titan in the history of computing science) states that computing power doubles roughly every two years. In his original 1965 paper, Moore observed that the number of components in an integrated circuit had been doubling every year; in 1975 he revised the forecast to a doubling every two years. Because component density correlates directly with the processing power of a semiconductor, the prediction has served as the miniaturization target for that industry ever since. While the pace of advancement has slowed, Moore’s prediction has held up for decades.
At a certain point, though, shrinking the physical world gets harder and harder, until conventional physics begins to find its limits. The cell phones of today are more powerful than banks (or even rooms of banks) of mainframe computers. Today’s transistors (the basic physical computing devices inside modern electronics) have replaced the vacuum tubes that served the same purpose yet were the size of your hand. Modern computer chips pack billions of these transistors, each measured in nanometers and only dozens of atoms wide. At this point, the only way to go is smaller, and that means atomic, or subatomic, processing engines. Truly “angels dancing on the head of a pin” territory.
Bits, Quantum Science and Qubits
Transistors can be either on or off. An “on” value represents a “1” and an “off” value represents a “0.” You can string these ones and zeros together to denote numbers in the language known as binary. These individual ones and zeros, the core units of binary, are referred to as bits. Today’s conventional computing is locked into these transistors and their physical design, which means it’s also bound by the laws of physics.
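The idea of stringing ones and zeros together can be sketched in a few lines of Python. This is a minimal illustration added for this article, not anything from the original post; the function name `to_bits` is an arbitrary choice.

```python
def to_bits(n, width=8):
    # Peel off bits by repeatedly taking the remainder mod 2,
    # least-significant bit first, then reverse into the usual order.
    bits = []
    for _ in range(width):
        bits.append(n % 2)
        n //= 2
    return bits[::-1]

# 13 = 8 + 4 + 1, so its 8-bit binary form is 00001101.
print(to_bits(13))  # [0, 0, 0, 0, 1, 1, 0, 1]
```

Every number a conventional computer handles is ultimately stored this way, as a fixed pattern of on/off transistor states.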
Which brings us to quantum computing, which takes its name from the branch of physics known as quantum theory. Quantum theory involves the physics of atoms and subatomic particles. Quantum physics is a topic that is impossible to distill into a single paragraph, or even a single article, but at a high level I’ll give it a shot with a single sentence: Small things behave strangely.
There is a classic thought experiment, often used to simplify the concepts of quantum mechanics, that even the least scientific person may be aware of, thanks to the popularity of the television show The Big Bang Theory: the tale of Schrödinger’s Cat. In 1935, the Austrian physicist Erwin Schrödinger suggested that if a cat were sealed in a box with a flask of poison rigged to shatter upon a random quantum event (the decay of a radioactive atom), the cat could be considered both alive and dead until its true state was observed, by opening the box and checking on the cat. It speaks to the concept of an object having multiple states, and to the idea that observation collapses those states into a distinct value. This concept of multiple states manifests itself in the real world. For example, light can behave both as waves of energy and as particles of energy.
Further, physics has a concept known as superposition. According to Wikipedia, “Quantum superposition is a fundamental principle of quantum mechanics. It states that, much like waves in classical physics, any two (or more) quantum states can be added together (“superposed”) and the result will be another valid quantum state; and, conversely, that every quantum state can be represented as a sum of two or more other distinct states.” It’s a fancy way of saying that you can combine multiple states into a new state, and still decompose that combined state back into its constituent parts.
But what do potentially poisoned pets and light waves have to do with computing?
Imagine something similar to a bit, but instead of holding only a single value of 0 or 1, it could hold 0, 1, or any blend of the two. Even better, what if this “something” could store multiple values (think multiple “states”) at the same time? That is the core concept behind quantum computing. In quantum computing, the “subatomic transistor” is so small that it is governed by the rules of quantum physics rather than conventional physics. The quantum equivalent of a bit can store multiple values simultaneously, with the act of observing those states collapsing them into a computed value. That quantum equivalent of a bit is called a qubit.
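A single qubit can be sketched as a toy model in Python: its state is a pair of amplitudes for 0 and 1, and measuring it collapses the blend into a definite value. This is an illustrative sketch written for this article (the `Qubit` class and its methods are inventions for demonstration), not a real quantum simulator.

```python
import random

class Qubit:
    def __init__(self):
        # Amplitudes for the 0 and 1 states; starts as a definite 0.
        # The squared amplitudes give the probability of each outcome.
        self.a, self.b = 1.0, 0.0

    def hadamard(self):
        # The Hadamard operation puts a definite qubit into an equal
        # superposition: "both 0 and 1 at once."
        s = 2 ** -0.5
        self.a, self.b = s * (self.a + self.b), s * (self.a - self.b)

    def measure(self):
        # Observation collapses the superposition: the qubit becomes
        # definitely 0 (with probability a**2) or definitely 1.
        if random.random() < self.a ** 2:
            self.a, self.b = 1.0, 0.0
            return 0
        self.a, self.b = 0.0, 1.0
        return 1

q = Qubit()          # a definite 0, like a classical bit
q.hadamard()         # now a superposition: amplitudes (0.707..., 0.707...)
result = q.measure() # collapses to 0 or 1, roughly 50/50
print(result)
```

Before the `measure()` call the qubit genuinely carries both possibilities; after it, like Schrödinger’s opened box, only one remains.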
Just as a quantum computer could theoretically store multiple values at once, it could also process values simultaneously. Much as parallel processing outperforms serial processing in today’s relational database systems, quantum parallelism could outperform conventional computation; in quantum computing circles, figures of millions of times faster than conventional computing are bandied about.
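The reason those figures get bandied about is that the number of simultaneous states grows exponentially with the number of qubits: n qubits span 2**n states at once, which a classical machine must track one amplitude at a time. A quick sketch (added for illustration, not from the original article):

```python
# Each added qubit doubles the number of states held in superposition.
# A classical simulator must track one amplitude per state, which is why
# simulating even ~40 qubits means juggling about a trillion values.
for n in (1, 2, 10, 40):
    print(f"{n} qubits -> {2 ** n} simultaneous states")
```

At 10 qubits that is 1,024 states; at 40 qubits, over a trillion, which hints at why a modest quantum machine could outrun a conventional one on the right problems.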
Progress and Challenges
While there’s a long way to go before quantum computing is a household term–let alone before the technology is powering mainstream devices–progress is being made.
In 2000, researchers at IBM’s Almaden Research Center created a 5-qubit quantum computer from five fluorine atoms. Later that same year, Los Alamos National Laboratory demonstrated a 7-qubit machine. In 2015, Los Alamos began serious work on a 1,000-plus-qubit quantum computer. And, just last year, Microsoft released a Quantum Development Kit along with Q#, a new programming language designed specifically for quantum computing. Microsoft also offers an Azure-based simulator that can simulate more than 40 qubits of computing power.
We’ve spent decades perfecting conventional computing to get where we are. Quantum computing will require moving beyond the theoretical concepts laid out in the 1970s: we need to figure out what to make qubits from, how to control them, and how to get data in and out. There is also the problem of observation itself, since in quantum physics the act of observing a process collapses its states into a single outcome.
With that said, while these challenges are daunting, we’re seeing progress toward a possible world of supercomputing beyond our imagination. It sounds like the stuff of science fiction, but, as with so many things–from cell phones to space travel–science fiction has a way of turning into science fact.