Building a quantum computer is like building a cathedral. Both take a couple of generations.
Useful, non-toy-sized quantum computing applications are still a few years to a decade or more away. But the push is on now. Governments are racing to advance their countries' quantum computing capabilities for national security reasons. Companies such as Google and IBM are competing for the bragging rights and the spoils of having quantum computing available online, even though the number of qubits is still quite low.
The industry is starting from a practical point of view by dealing with the devils that are known. For instance, in IC design and manufacturing, Intel is focusing on silicon that can be made in its existing fabs, and current electronic design automation tools are supporting quantum research as is.
It’s too early to solve the really sticky issues where quantum computing might require new or tweaked tools for IC design and verification. For example, error rates increase when more qubits are operating at once, and those calculations are susceptible to external noise (vibrations and temperature).
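The point about error rates compounding as systems grow can be illustrated with a back-of-envelope model (this is an illustrative assumption, not a calculation from the article): if each gate in a quantum circuit succeeds independently with probability p, a circuit of n gates succeeds with probability p**n, so even high per-gate fidelity erodes quickly as circuits scale.

```python
# Toy model: overall circuit success probability under independent gate errors.
# Assumption: errors are independent and uncorrelated, which real hardware
# (with crosstalk and correlated noise) does not satisfy exactly.

def circuit_success_probability(gate_fidelity: float, num_gates: int) -> float:
    """Probability that every gate in the circuit executes without error."""
    return gate_fidelity ** num_gates

# With 99.9% per-gate fidelity, a 1,000-gate circuit succeeds only about
# a third of the time, which is why error rates dominate the scaling problem.
for n in (10, 100, 1000):
    print(n, round(circuit_success_probability(0.999, n), 3))
```

This is one reason practitioners talk about needing millions of physical qubits: large error-corrected computations must overcome exactly this kind of exponential fidelity decay.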
“In the semiconductor manufacturing space, at this point, we are not seeing any issues,” said Juan Rey, vice president of engineering at Mentor, a Siemens Business. “From the semiconductor manufacturing point of view, there doesn’t seem to be a need to use some of the most advanced semiconductor processing techniques out there.”
The challenges Rey sees today are in the materials, and getting consistency in the results. “It’s making sure that the processes work in the way that we need, but not on the interaction between what traditionally the semiconductor manufacturing needs at the interface between design and manufacturing. The focus is a lot more on the verification aspect — physical verification, but it could be electrical verification also. That is the major focus.”
Rey sees the challenges in EDA for quantum computing showing up when the number of qubits increases and more devices are integrated. “When companies go to large numbers of integrated devices, they may have some challenges that they are not having today because of having a very small number of qubit retention — and at the system level due to the interaction between the areas that absolutely must be at a very low temperature with the rest of the system.”
So in effect, the goal is to start with what is known and proven, and move from there. “Our first focus then is to make sure that any physical verification needs are satisfied,” Rey said. “From that point of view, without naming customers, there are a few customers that use Calibre for physical verification. And they range from being very secretive and not disclosing anything in terms of issues or concerns that they may have, to the more open and working with us on incorporating some of the advanced features that we have been developing over a year, including some complex physical verification checks, or even some electrical verification checks. Our first focus then is to make sure that we are not the reason why they cannot accomplish their technical objective in the area where we are strong, which are in principle both physical verification as well as semiconductor manufacturing.”
The resources of a big company
This approach also allows small teams to be agile enough to explore solutions, but also to be able to tap the resources of large companies. This is the case with Intel’s Components Research Division, which can translate into a companywide team when necessary.
“We tap into our entire infrastructure here for manufacturing and technology development R&D,” said Ravi Pillarisetty, senior research scientist who works on quantum computing in the Components Research Division at Intel. “So let’s say we tap into a huge army of people who make our production test chips. They help us make our test chip here. In the fab, we tap into all of the module and process infrastructures. It effectively becomes a very large team, because we tap into the power of our fab and our infrastructure. If you look at our program, we span the entire stack — all the way from the components research side, which is doing materials and devices and process and interconnects. But we are off in the stack. When you get to architecture, software, and application algorithms, that’s done by a team that’s managed in Intel Labs. Intel Labs is where we do all of our forward-looking compute research. So they’re kind of a brother to the Components Research on the process side. Obviously, we have a very diverse team here, where we have experts in devices and transistors, and experts coming specifically from new hires, for example, with real quantum experience really doing qubit experiments on the qubit side, and then we’ve got experts in integration and actually how to build stuff in our fab. And then working to the lab side, you’ve got experts in cryogenic — how we do cryogenic circuit design like the cryo control chip. And then as you go up, the applications algorithms space is very unique because you’re actually bridging computer science and a theoretical kind of quantum physics. That’s a unique skill set. Across the stack, you need a very diverse team. And as you go forward, you need to hire a lot of new PhDs who are coming out.”
Even the QEDC, a new government-sponsored industry consortium for quantum computing, points to the need for a quantum workforce and asks how to create one. “That’s when you look at a lot of this money coming in for this national quantum initiative,” said Pillarisetty. “A big charter there is to make sure that the students are actually getting trained so that when this workforce goes forward and scales up, you will have the actual workforce with the expertise coming in to actually meet the needs once this will become manufacturable.”
Sticking with qubits and silicon manufacturing
First, the most practical thing is to stick with something you know and something that is scalable. “We’re going to need millions of qubits. So we need a scalable system,” said Pillarisetty. The superconducting qubit is the most mature today. “We do have two programs, and the way we see it, the superconducting qubit is the most mature right now. You can make the largest number of qubits on the chip. And we can use those right now to actually run real algorithms on and get full-stack learning. We don’t see the superconducting qubits as being scalable, though. So while they do provide us the mature kind of system, and something we can use to learn at the system level, we really see spin qubits as the long-term thing that we need to make work. And that plays to our strengths because they’re very similar to transistors.”
Scaling of qubits will take some time, however. “Our focus at this point really isn’t a numbers game, because whether it’s 20 qubits, 50 qubits, 100 qubits, at the end of the day you’re going to need millions,” he said. “We’re looking at this as a long-term play that eventually we have to make millions, and we see at the qubit level, or device level, spin qubits have this path. They fit with our manufacturing flow. If you look at prototype data that’s been made in universities, there’s a proof of concept. The device works; it exists. It’s a matter of, let’s say, more of an engineering challenge for how we scale that and fit that into a manufacturing framework and do the high-volume R&D to solve all of the associated integration and device issues. So that’s our focus.”
Silicon-based qubits take some of the guesswork and newness out of the quantum challenge because they are based on processes and techniques that have been refined over decades. “If you look at spin qubits, you’ll find that you can make quantum dots in almost any kind of semiconductor system. But III-V systems can’t be isotopically purified, so they have nuclear spin interaction problems. Those won’t work. You need to go to a silicon-based system. If you look at the spin qubit community, the two qubits that are out there are silicon, something that is like a silicon MOSFET, which is what I was presenting in our internal work at Intel. A lot of the external university groups also use these quantum well systems, where they have silicon sandwiched between silicon germanium barriers. That’s a quantum well kind of technology, similar to III-Vs. With those you can get much higher mobilities because you space the device away from the gate. But right now, if you look at the actual benchmarking of the devices, the final error rates or fidelity metrics are pretty similar between the two.”
Silicon is easier to integrate into existing fabs, too. So if silicon does the job, it’s that much simpler.
“You need to have a strong value proposition to move to a different material for qubits, because there are so many challenges associated with the qubits,” Pillarisetty said. “If I could stay with silicon, that makes the number of new things or issues I have to solve much smaller and more manageable.”
Not to be ignored
The entire semiconductor industry is taking quantum seriously, even if the timing is rather hazy.
“We pay a lot of attention, and very close attention, to both the startups and the large, well-established companies that need to use our tools to accomplish their very different goals,” said Mentor’s Rey. “We have been following quantum computing for several years as part of the Stanford System X, which we participate in and fund. We look at quantum computing from two different directions. One is that the industry has been moving very fast in this space in terms of the number of startups and very large, well-established companies that are taking quantum computing seriously to the point of making large investments in that space.”
What’s clear, though, is the industry hasn’t zeroed in on a single best approach yet, and it is using what it knows best in the interim — and maybe forever.
“The current generation of quantum computers can run very small toy-sized examples of this problem, but they can’t do full-scale industry problems or provide any kind of a speed-up,” said Matt Johnson, founder and CEO of QC Ware, a company that makes algorithms for quantum computing. “It’s not yet there. Despite that, leaders in industry have decided that quantum computing needs to be on their strategic roadmap, and that they need to be investing research and development dollars to understand how this is impacting their industry. They anticipate this the same way that the leaders of R&D and the CTOs of every self-respecting large enterprise are looking at other emerging technologies. It just so happens that quantum computing is gaining momentum, and people are gaining a deeper understanding of its implications for their business. So what we like to say is that just as every corporation is investing research dollars to understand the implications of other technologies, doing this for quantum computing is, in a sense, purchasing an insurance policy against being disrupted by this technology.”
That opinion is far from unique, given the intense focus on this futuristic technology. “The time to look to advance research is now,” said Mentor’s Rey. “The time to work and support the companies that are developing systems is now.”