Futurist Isaac Arthur explains why quantum computing is a lot more complicated than traditional computing.
Dan Patterson, a Senior Producer for CBS News and CNET, interviewed futurist Isaac Arthur about the myths and realities of quantum computing. The following is an edited transcript of the interview.
Isaac Arthur: We may have another Moore's Law equivalent where capacity doubles every year, and there might be some phases where we do that. I would not be surprised if it was doubling every year right now.
The notion of building a quantum computer that has a million entangled qubits is, I don't want to call it fantastical, and people will always tend to say, well, think about what happened with computers and transistors. But trying to keep a whole bunch of things like that entangled is quite a lot different.
It's the difference between trying to keep a choir of five people on a chord and trying to keep a million people singing all at the same time. Those qubits have to stay entangled, and you have a very short window while they're entangled to actually take that snapshot. There's not a very long period of coherence, and it all has to be done at extremely cold temperatures.
That’s just the hardware end of making sure it happens. Then of course, there is, as we mentioned, that whole algorithm question. You can’t solve a problem like this unless you know what to ask. The ability to randomly look at every page of a book in a library is great. If you don’t know how to find out which one of them pulled the right result, though, it doesn’t help you much.
It has great options for things like searching, as we said earlier, but that all has to be taken in the context of when it is going to be faster than the classical computer. Until a quantum computer starts getting to the point where it can really have many millions of qubits entangled simultaneously, that's probably not going to happen.
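The search speedup mentioned here is usually attributed to Grover's algorithm, which finds an item in an unstructured list of N entries with roughly the square root of N queries instead of about N/2 classically. The interview does not name the algorithm, so the sketch below is an illustrative query-count comparison, not a claim about any specific hardware:

```python
import math

def classical_queries(n):
    # Unstructured classical search: on average you check about half the entries.
    return n / 2

def grover_queries(n):
    # Grover's algorithm needs roughly (pi/4) * sqrt(n) oracle queries.
    return (math.pi / 4) * math.sqrt(n)

# Compare the two query counts as the search space grows.
for n in (10**3, 10**6, 10**12):
    print(f"n={n}: classical ~{classical_queries(n):.0f}, quantum ~{grover_queries(n):.0f}")
```

For a million entries, the quantum query count is in the hundreds rather than the hundreds of thousands, which is where the "searching every page at once" intuition comes from, provided you have an oracle that can recognize the right answer.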
Even then, when we start hearing about machines with a million qubits, it's more likely that they would have packages of about a thousand qubits that were entangled, instead of a full million entangled at once. You have modules up to a certain size. The question will be, what is the biggest module we can make?
Then you can start putting them in parallel, just like we do with other computers, but the capacity won't be rising exponentially. You wouldn't have 2-to-the-1-million options available to you. You'd just have 2-to-the-1,000 times 1,000, which again would be an insanely huge number itself, but nowhere near as big.
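The arithmetic behind that comparison can be checked directly. A fully entangled register of n qubits spans 2^n basis states, while k independent modules of m qubits each give only k × 2^m. The numbers below are the illustrative ones from the passage (1,000 modules of 1,000 qubits versus one million entangled qubits):

```python
import math

# 1,000 independent modules of 1,000 entangled qubits each.
modular_states = 1000 * 2 ** 1000
digits_modular = len(str(modular_states))  # decimal digits in that count

# 2**1_000_000 is too large to materialize; count its decimal digits instead.
digits_full = math.floor(1_000_000 * math.log10(2)) + 1

print(digits_modular)  # 305
print(digits_full)     # 301030
```

So the modular design still yields a number with a few hundred digits, while full million-qubit entanglement would yield one with over 300,000 digits: both "insanely huge," but separated by an enormous gulf.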