Nobel Prize-winning physicist Frank Wilczek explores the secrets of the cosmos.
Precision is a powerful tool, but it can be hard to come by. That theme, with variations, is a leitmotif of science, organic life and modern technology. It is sounding again today, at the frontier of quantum computing.
Consider biology. Complex organisms store their essential operating systems—instructions for how to build cells and keep them going—within long DNA molecules. Those basic programs must be read out and translated into chemical events. Errors in translation can be catastrophic, resulting in defective, dysfunctional proteins or even in cancers. So biology has evolved an elaborate machinery of repair and proofreading to keep error rates low—around one per billion operations. A series of complicated molecular machines examines the work in progress and corrects mistakes. The creation of this proofreading machinery is one of evolution's greatest achievements.
Many applications of computers also need precision. (For instance, in bank transactions it’s important to get passwords and transfers exactly right!) Modern computer technology came into its own when small, reliable solid-state transistors became available. Here, the basic distinction between “0” and “1” gets encoded in two alternative locations for buckets of electrons. When there are many electrons per bucket, errors in the position of one or a few don’t spoil the message.
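The protection that comes from many electrons per bucket can be pictured as a majority vote: if each carrier independently strays with some small probability, the bit as a whole is read correctly as long as most carriers agree. A minimal sketch, as a toy model rather than real device physics (the function name and parameters are illustrative):

```python
import random

def read_bit(stored_bit: int, n_electrons: int, flip_prob: float) -> int:
    """Read a bit stored redundantly across n_electrons carriers.

    Each carrier independently ends up in the 'wrong' place with
    probability flip_prob; the readout takes a majority vote.
    Toy model only, not actual transistor physics.
    """
    votes = sum(
        stored_bit if random.random() > flip_prob else 1 - stored_bit
        for _ in range(n_electrons)
    )
    return 1 if votes > n_electrons / 2 else 0

random.seed(0)
trials = 10_000
# With many carriers, errors in the position of a few don't spoil the message;
# with only a handful, a couple of stray electrons can flip the readout.
errors_many = sum(read_bit(1, 1001, 0.1) != 1 for _ in range(trials))
errors_few = sum(read_bit(1, 3, 0.1) != 1 for _ in range(trials))
print(errors_many, errors_few)
```

Run it and the large-bucket readout essentially never errs, while the three-carrier version fails a few percent of the time, which is the near error-immunity, and its loss at small scales, described below.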
But in doing computations the computer must move the buckets around. Making those buckets of electrons smaller makes the job of moving them around easier. Indeed, the computer industry’s spectacular record of ever-faster speed is largely the story of lowering the number of electrons used to make a bit; nowadays we’re approaching ten or fewer. Unfortunately, at this frontier the near error-immunity that stems from having many “redundant” electrons is less automatic. To maintain nearly error-free, precise operation, new tricks will be necessary.
Fundamental physics brings in another issue. As you approach the level of single electrons, the effects of quantum mechanics become more prominent. This is both a challenge and, potentially, a blessing.
In quantum mechanics, we learn that electrons do not have definite positions but rather distributions of probability. This further blurs the crucial 0-1 distinction. The potential blessing is the flip side of that coin. We can try to exploit the complexity of matter that quantum theory reveals for useful purposes. This is the vision of so-called quantum computing. Vigorous research efforts aimed at providing useful platforms for quantum computing are in progress around the world.
The great challenge is to reach high precision. People are pursuing two kinds of strategies, broadly parallel to those in biology and classical computing.
The first, quasi-biological approach is to let some errors happen but to work hard to correct them. Unfortunately, good error correction requires redundancy and lots of complex machinery, so this path to precision is painful. The second, called topological quantum computing, is a more avant-garde approach. It is premised on making buckets of energy that have a kind of memory—so-called "anyons." People hope to construct new kinds of transistors that use anyonics rather than electronics. But only very recently have physicists succeeded in producing anyons at all, and it remains to be seen whether they can be put to practical use.
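The trade-off in the first approach can be made quantitative with a classical stand-in for error correction: copy each bit n times and take a majority vote. Below a threshold error rate, each increase in redundancy buys a sharp drop in the failure rate, but the copies themselves are the cost. A minimal sketch (the function name is illustrative; real quantum error-correcting codes are far more intricate, since quantum states cannot simply be copied):

```python
from math import comb

def majority_failure(p: float, n: int) -> float:
    """Probability that a majority vote over n noisy copies fails,
    given each copy errs independently with probability p (n odd)."""
    k = n // 2 + 1  # smallest number of wrong copies that wins the vote
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# More redundancy drives the failure rate down fast when p is small...
for n in (1, 3, 9, 27):
    print(n, majority_failure(0.01, n))
# ...but every extra decimal place of precision costs many more
# physical copies, which is why this path to precision is painful.
```

With a 1% per-copy error rate, three copies already cut the failure rate by more than an order of magnitude, and nine copies by far more still.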
Can we continue to meet the challenge of precision? Nature herself inspires faith, for Nature “manufactures” perfectly interchangeable parts (e.g., electrons) in vast quantities and “calculates” their behavior—with perfect accuracy.
Copyright ©2020 Dow Jones & Company, Inc. All Rights Reserved.