- A new DNA computer calculates square roots of perfect squares up to 900.
- Like quantum computers, DNA computers are an exciting frontier of post-silicon computing.
- Where previous examples were up to 4 digits of binary, this one is 10 digits.
Scientists at the University of Rochester have used DNA to make a simple computer that calculates the square roots of perfect squares up to 900, New Scientist reports. The computer works by encoding numbers onto a strand as 10-bit binary markers (values up to 1,023), and the solutions light up using fluorescence.
A new paper published in Small explains why the DNA computer is both unique and powerful. Previous DNA computer models have only been able to calculate square roots of 4-bit binary numbers, says the team, led by researcher Chunlei Guo, in the paper’s abstract. That means just four digits’ worth of 0s and 1s, for a maximum value of 15 (a range of 16 values, counting 0). Guo’s computer calculates with 10 bits, meaning 1,024 values ranging from 0 to 1,023.
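To make those ranges concrete, here is a short Python sketch (ours, not the paper’s) that lists the perfect squares a binary register of a given width can hold:

```python
def perfect_squares(bits):
    """Perfect squares representable in an unsigned register of `bits` bits."""
    max_value = 2**bits - 1  # 15 for 4 bits, 1,023 for 10 bits
    return [n * n for n in range(int(max_value**0.5) + 1)]

print(perfect_squares(4))   # [0, 1, 4, 9]
print(perfect_squares(10))  # 32 values, from 0 up to 961 (31 squared)
```

The jump from 4 to 10 bits grows the list of solvable inputs from 4 perfect squares to 32.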
Calculating square roots of perfect squares (the integers whose square roots are also integers) is logically involved, but it can still be done with mechanical circuitry and logic gates. Guo’s team says its computer works for squares up to 900, although one more sneaky perfect square, 961, also fits within 10 bits, and 1,024 itself is a perfect square that just exceeds the 10-bit range.
DNA computing is similar in a big-picture way to quantum computing, because both use the physical arrangement of molecules and particles as a mechanical form of computation. Guo’s team believes DNA computing will likewise join quantum computing as a method that can eventually outpace silicon-chip computing.
Today, the average person’s interaction with computers feels extremely removed from the mechanics and logic the hardware is actually performing. Between our keyboards and the electrical pulses carrying out operations at the circuit level, there are layers upon layers of programming that grow more complicated and mathematical, and less recognizable as language.
Code in Java or even HTML often uses whole ordinary words like “main” and “strong,” and these are translated into strings of binary in order to be executed. Those strings grow more complex, but they are still executed as microbursts of electricity. It’s within this simple mechanic that DNA computing operates. As a pulse winds its way through a circuit, it encounters gates (switches) like the ones that redirect train tracks.
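Those gates can be sketched in a few lines of Python. This is a generic illustration of gate logic, not the paper’s DNA circuit: two gates are already enough to add a pair of one-bit values.

```python
# Basic logic gates, modeled as functions on bits (0 or 1).
def AND(a, b): return a & b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two 1-bit values using just two gates."""
    return XOR(a, b), AND(a, b)  # (sum bit, carry bit)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Chaining many such gates is how both silicon circuits and, in their own chemistry, DNA circuits build up arithmetic from switches.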
DNA is a natural medium for binary calculation because its naturally cohering base pairs form an implicit binary logic path. The DNA that Guo’s team used in its computer is made from two single strands that are bonded in a process called hybridization, creating a brand-new strand free of the context of living cells.
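As a rough, simplified illustration of our own (not the team’s sequence design), the base pairing behind hybridization can be modeled as a lookup that produces the complementary strand a given strand would bond to:

```python
# Watson-Crick base pairing: A bonds to T, C bonds to G.
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def hybridize(strand):
    """Return the complementary strand that would bond to `strand`
    (read in the opposite direction, hence the reversal)."""
    return "".join(PAIR[base] for base in reversed(strand))

print(hybridize("ATCG"))  # CGAT
```

Because each base has exactly one partner, complementarity behaves like a deterministic two-symbol code, which is what makes DNA usable as a logic substrate.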
“It can calculate the square root of a 10 bit binary number (within the decimal integer 900) by designing DNA sequences and programming DNA strand displacement reactions,” the team says in its abstract. The changing strands and sequences trigger color codes that show what the answer is. It’s a callback to the earliest days of mechanical computers in the most future-forward way possible.
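The arithmetic those strand-displacement reactions carry out can be mirrored in conventional code with a classic bit-by-bit integer square root. This Python sketch is our analogy for the logic, not the team’s DNA encoding, but it returns the same answers the fluorescent readout reports for 10-bit perfect squares:

```python
def isqrt_10bit(x):
    """Bit-by-bit integer square root of a 10-bit value (0-1023)."""
    assert 0 <= x < 1024
    root = 0
    bit = 1 << 8  # largest power of four within 10 bits
    while bit > x:
        bit >>= 2
    while bit:
        if x >= root + bit:
            x -= root + bit
            root = (root >> 1) + bit  # set this bit of the root
        else:
            root >>= 1
        bit >>= 2
    return root

print(isqrt_10bit(900))  # 30
```

Each pass through the loop decides one binary digit of the answer, which is exactly the kind of digit-at-a-time decision a cascade of logic gates, silicon or DNA, can make.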
This is a syndicated post. Read the original post at the source link.