THE LITHIUM-ION BATTERY is the unsung hero of the modern world. Since it was first commercialised in the early 1990s, it has transformed the technology industry with its ability to store huge amounts of energy in a relatively small amount of space. Without lithium, there would be no iPhone or Tesla – and your laptop would be a lot bigger and heavier.
But the world is running out of this precious metal – and it could prove to be a huge bottleneck in the development of electric vehicles, and the energy storage solutions we’ll need to switch to renewables. Some of the world’s top scientists are engaged in a frantic race to find new battery technologies that can replace lithium-ion with something cleaner, cheaper and more plentiful. Quantum computers could be their secret weapon.
It’s a similar story in agriculture, where up to five per cent of the world’s consumption of natural gas is used in the Haber-Bosch process, a century-old method for turning nitrogen in the air into ammonia-based fertiliser for crops. It’s hugely important – helping sustain about 40 per cent of the world’s population – but also incredibly inefficient compared to nature’s own methods. Again, quantum computers could provide the answer.
So far, researchers have been working on these problems with blunt tools. They can perform increasingly powerful simulations using classical devices, but the more complicated the reactions get, the harder they become for supercomputers to handle. This means that right now, scientists are limited to looking only at very small problems, or they are forced to sacrifice accuracy for speed.
A hydrogen atom, for instance, has one positively charged proton and one electron and is easy to simulate on a laptop – you could even work out its chemistry by hand. Helium, the next element along the periodic table, has two protons orbited by two negatively charged electrons – but it’s more challenging to simulate, because the electrons are entangled: the state of one is linked to the state of the other, so both must be calculated together.
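A minimal sketch of why entanglement forces joint calculation. For two two-level particles, a state is a vector of four amplitudes over the basis states |00⟩, |01⟩, |10⟩ and |11⟩; it factors into two independent one-particle states exactly when the amplitudes [a, b, c, d] satisfy a·d − b·c = 0. An entangled state, such as the Bell state, fails that test, so it can only be described – and simulated – as a whole:

```python
import math

def is_entangled(state, tol=1e-12):
    """For a two-particle state [a, b, c, d] over |00>, |01>, |10>, |11>:
    the state factors into two one-particle states iff a*d - b*c == 0."""
    a, b, c, d = state
    return abs(a * d - b * c) > tol

# A product state: particle 1 in (0.6, 0.8), particle 2 in (0.8, 0.6).
product = [0.6 * 0.8, 0.6 * 0.6, 0.8 * 0.8, 0.8 * 0.6]

# The Bell state (|00> + |11>) / sqrt(2): maximally entangled.
bell = [1 / math.sqrt(2), 0, 0, 1 / math.sqrt(2)]

print(is_entangled(product))  # False: describable particle by particle
print(is_entangled(bell))     # True: no such factorisation exists
```

The amplitude count doubles with every entangled particle added – which is why the vector a classical computer must track grows exponentially.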
By the time you get to thulium – which has 69 orbiting electrons, all entangled with each other – you’re far beyond the capability of classical computers. If you wrote down one of the possible states of thulium every second, it would take 20 trillion years – more than a thousand times the age of the universe. In his 2013 book Schrödinger’s Killer App, John Dowling calculates that to simulate thulium on a classical computer, you would need to buy up Intel’s entire worldwide production of chips for the next 1.5 million years, at a cost of some $600 trillion.
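A back-of-envelope check of those figures. Treating each of thulium’s 69 entangled electrons as a two-level system – an illustrative simplification, not the full atomic physics – gives 2^69 basis states, and writing one down per second:

```python
# 69 entangled two-level systems -> 2**69 basis states (illustrative count)
states = 2 ** 69                          # about 5.9e20

seconds_per_year = 365.25 * 24 * 3600     # about 3.156e7
years_to_list = states / seconds_per_year # at one state written per second

age_of_universe = 13.8e9                  # years, current best estimate

print(f"{years_to_list:.2e} years")       # ~1.9e13, i.e. roughly 20 trillion
print(years_to_list / age_of_universe)    # ~1.4e3: over a thousand universe ages
```

The result lands on roughly 19 trillion years, matching the article’s “20 trillion years” and “more than a thousand times the age of the universe”.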
A much quicker alternative would be to simply measure the atom directly. “Classical computers seem to experience an exponential slowdown when put to simulating entangled quantum systems,” Dowling writes.
“Yet, that same entangled quantum system shows no exponential slowdown when simulating itself. The entangled quantum system acts like a computer that is exponentially more powerful than any classical computer.” Although we’ve known all the equations we need to simulate chemistry since the 1930s, we’ve never had the computing power available to do it. This means that often, when dealing with complex simulations that are intractable for classical computers, the best approach is still to simply try lots of different things in the real world and draw conclusions from observation and experiment.
“We can’t really predict how electrons are going to behave right now,” says Zapata’s Christopher Savoie. “If we can get into a world where we’re simulating it on a computer, we can be more predictive and do fewer actual laboratory experiments.” It is, he says, as if Airbus were still testing planes by building small-scale models and throwing them into the sky. “You cannot simulate chemical processes that you’re interested in,” says Google’s Sergio Boixo. “With a lot of the low-level materials science and engineering, you’re kind of blind.”