Why would you want an on-premise quantum computer?

Why would you want an on-premise quantum computer? (via Qpute.com)


Quantum computing is advancing at a rapid pace, but forever feels five to ten years away. Though most of the companies in the space regularly claim new qubit milestones, the field still largely remains in the research phase, with companies tentatively testing quantum capabilities and use cases through cloud portals.

However, that is beginning to change. Last month IBM announced its first private sector on-premises quantum computer deployment in the US at the Cleveland Clinic medical center in Cleveland, Ohio.

As part of a 10-year partnership to establish a joint research center – with the aim of accelerating discovery in healthcare and life sciences through high-performance computing on the hybrid cloud, AI, and quantum computing – IBM will install the first on-premise System One in the United States in 2022.

“Having this technology on-site paired with IBM teams and scientists will allow both organizations to work closely together and truly move medicine forward,” Lara Jehi, MD, Cleveland Clinic’s chief research information officer, tells DCD. “As leaders in research and healthcare, we want to shape the future and ask the research questions ourselves so we can define the uses of quantum in healthcare for us and the rest of the biomedical research and healthcare communities.”

Jehi says the new machine will enhance research in Cleveland Clinic’s new Global Center for Pathogen Research & Human Health in areas such as genomics, population health, clinical applications, and chemical and drug discovery.

Deploying an on-premise quantum computer

First announced in 2019, the System One is a self-contained quantum computer enclosed in a nine-foot sealed cube, made of half-inch thick borosilicate glass. Up until now, IBM had only installed quantum systems at its own facilities – more than 20 have been deployed – and then made them available for use through its Q Network cloud service.

Bob Sutor, chief quantum exponent at IBM, tells DCD the two companies are still in the planning stages, but says the system installed will be the ‘latest generation of the device,’ with between 50 and 100 qubits. He didn’t share power requirements for the system.

“IBM has extensive experience installing quantum systems in its own sites, and we will work with the Cleveland Clinic to configure the space for their system,” he adds. “IBM will work with Cleveland Clinic staff to train them and support the installation.”

The healthcare provider has a 3MW, 263,000 sq ft data center in Brecksville, to the south of the city, but this will not be where the new System One will live. Jehi says that while the exact location of where the quantum machine will be installed is still under discussion it will be on “a secured site on Cleveland Clinic’s main campus in Cleveland.”

There are a number of companies in the quantum space – large technology players such as Google and Microsoft, defense companies such as Honeywell, as well as startups including IonQ and D-Wave. However, it was Cleveland Clinic’s experience of using the Q Network service, along with IBM’s focus on AI and healthcare, that proved some of the deciding factors in opting for Big Blue’s quantum hardware.

“Cleveland Clinic researchers recently used IBM quantum computing to discover that a type of estrogen may be beneficial against Covid-19,” explains Jehi. “Through an analysis on the quantum computer, they systematically evaluated possible interaction between toremifene and SARS-CoV-2 viral protein structures, and demonstrated that toremifene blocks the spike protein to inhibit viral infection, thus making it a potential treatment for Covid-19.”

Try before you buy quantum computing

A number of companies, including BMW and Roche, are running trials with quantum computing to see if it can provide previously hidden insights and deliver on its nascent potential. But given the still-experimental nature of quantum computing – and the fact that HPC-style workloads rarely demand low latency – DCD was curious why the Clinic bothered to deploy an on-premise machine when the cloud service seemed to be working well.

“The cloud offering allows us to be users, like the rest of the research community,” says Jehi. “The on-premise system provides us with a chance to be leaders, creators, and developers with IBM.”

She adds that the skills and workforce component was also important: the collaboration includes plans for education and workforce development in quantum computing, such as training and certification programs in data science and quantum computing, as well as joint research symposia and workshops.

As part of the agreement, IBM will provide Cleveland Clinic access to research and commercial technologies including RoboRXN to help scientists design and synthesize new molecules remotely; the IBM Functional Genomics Platform to accelerate discovery of molecular targets for drug design, test development and treatment; Deep Search, which helps researchers access structured and unstructured data quickly; and High-Performance Hybrid Cloud Computing for when researchers need to burst workloads into the cloud.

Though potentially powerful, 50-100 qubits is limited compared to what quantum machines are expected to be capable of in the future. As part of the agreement, Cleveland Clinic will also be home to the world’s first on-premise Q System Two – IBM’s ‘next generation 1,000+ qubit’ system – though the timeline for its arrival is still being finalized. IBM has previously said it hopes to reveal that system in 2023. The company says its largest current quantum computer contains 65 qubits, and it plans to release its 127-qubit IBM Quantum Eagle processor sometime in 2021.

“We felt that with IBM’s strength in high-performance computing, AI and quantum computing technologies and Cleveland Clinic’s expertise in healthcare research, we could together build a robust research and clinical infrastructure to empower big data medical research and accelerate the scientific discovery process,” says Jehi. “Cleveland Clinic and IBM share a commitment towards research and education that is front and center in their mission.”

It is still hard to judge the performance of different quantum computing systems and approaches – both against each other, and against conventional computing. In 2019 Google claimed that it had outperformed Summit, the world’s fastest supercomputer, on a very select workload, doing in a few minutes what it said Summit would take 10,000 years to solve. But not long after, IBM claimed that with optimization the workload could be done in 2.5 days on Summit – and with far greater fidelity than on Google’s Sycamore.

Similar quantum computing milestones have been met with similar rebuttals, with competitors and detractors noting that conventional computing is still far ahead when workloads are optimized.

Each company also prefers its own benchmark to show that its quantum computing platform is ahead. IBM, for example, favors measuring quantum volume – a ‘holistic measure’ the company devised to capture different aspects of gate-based quantum computers. By its own measurement, IBM is number one.
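The idea behind quantum volume can be sketched in a few lines: the metric is two raised to the size of the largest ‘square’ random circuit (equal width and depth) a machine can run while its measured heavy-output probability stays above two-thirds. A minimal illustration of the definition – not IBM’s implementation, and assuming the pass/fail results have already been measured:

```python
# Sketch of the Quantum Volume definition (not IBM's implementation).
# A machine "passes" size n if random circuits of width n and depth n
# produce heavy outputs with probability above 2/3.

def quantum_volume(largest_passing_n: int) -> int:
    """Quantum Volume is 2**n for the largest square-circuit size n
    (width == depth == n) that the machine reliably passes."""
    return 2 ** largest_passing_n

# A machine that passes 6-qubit, depth-6 circuits but fails at 7 qubits
# would have a Quantum Volume of 64:
print(quantum_volume(6))  # 64
```

Because the metric demands both width and depth, adding qubits alone doesn’t raise quantum volume – gate fidelity and connectivity must keep pace, which is why qubit counts and quantum volume rankings can diverge.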

In an effort to build a broader and more unbiased ranking system, US agency DARPA this week launched the Quantum Benchmarking program.

“It’s really about developing quantum computing yardsticks that can accurately measure what’s important to focus on in the race toward large, fault-tolerant quantum computers,” said Joe Altepeter, program manager in DARPA’s Defense Sciences Office.

“Building a useful quantum computer is really hard, and it’s important to make sure we’re using the right metrics to guide our progress towards that goal. If building a useful quantum computer is like building the first rocket to the moon, we want to make sure we’re not quantifying progress toward that goal by measuring how high our planes can fly.”


This is a syndicated post.