Google quantum computer prototype with nine superconducting qubits. (Image courtesy of Google Research).

Google Pioneers Hybrid Approach to Quantum Computing

June 17, 2016
Scientists from Google and the University of the Basque Country in Spain believe they have cleared some of the barriers to more complex and useful quantum computers.

The 8 Ball supercomputer, a spherical monolith that harnesses the rules of quantum mechanics to solve vastly complex problems, is not real. It is a quantum computer imagined by science fiction novelist Gregory Dale Bear in a short story published last year.

Such computers have yet to escape the realm of science fiction, but recent advances have moved the prospect of a working quantum computer closer to reality. Scientists from Google and the University of the Basque Country in Spain believe they have now cleared some of the barriers to more complex and useful machines.

The technology is based on quantum bits, or qubits, which loosely correspond to the classical bits stored in the transistors etched onto silicon chips. Qubits can encode more data than the 1s and 0s that form the basis of modern computing because they are subject to superposition, a quantum principle that lets them hold a 1, a 0, or both at the same time.
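
As a rough illustration of how superposition is modeled mathematically (a generic textbook sketch, not drawn from the Google work), a single qubit's state can be written as a two-component complex vector whose squared amplitudes give the odds of measuring a 0 or a 1:

```python
import numpy as np

# Basis states |0> and |1> as complex vectors.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# An equal superposition: both amplitudes are nonzero at once,
# so the qubit holds "0 and 1 at the same time" until measured.
psi = (zero + one) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
p0, p1 = np.abs(psi) ** 2
print(p0, p1)  # each is 0.5
```

Only on measurement does the qubit collapse to a definite bit, with the odds set by those amplitudes.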

Today’s quantum computers contain only a handful of qubits, a far cry from the billions of transistors that Intel can carve into its silicon chips. Earlier this year, IBM built a quantum computer using five superconducting qubits and allowed researchers to access it over the cloud. The Canadian startup D-Wave has built a 1,097-qubit machine, but it can apparently run only an extremely narrow range of algorithms faster than Intel chips.

The Basque and Google scientists believe the secret to building a universal quantum computer with more qubits lies in merging the field’s two main strands of research. The first involves linking the qubits into primitive circuits that can perform one type of operation to solve very specific problems.

The bulk of quantum computing theory, along with most of the funding from government agencies and private backers, is focused on this digital approach. Google has pursued it through its Quantum Artificial Intelligence Lab, where it has also installed quantum computers from D-Wave for experiments.

The D-Wave machines are based on the second major approach, known as adiabatic quantum computing. In that scheme, the qubits are arranged in a certain pattern and their interactions are changed gradually until the system settles into a state that encodes the solution. Unlike a digital quantum computer, which must be programmed for a specific calculation, the evolving array of qubits can be fed almost any problem.
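
In textbook terms (a toy two-level sketch, not D-Wave's hardware or the team's protocol), adiabatic computing interpolates between an easy starting Hamiltonian H0, whose lowest-energy state is simple to prepare, and a problem Hamiltonian HP, whose lowest-energy state encodes the answer:

```python
import numpy as np

# Pauli matrices for a single two-level system.
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])

H0 = -sx  # easy start: ground state is an equal superposition
HP = -sz  # problem Hamiltonian: ground state |0> is the "answer"

def H(s):
    """Interpolated Hamiltonian H(s) = (1 - s) * H0 + s * HP."""
    return (1 - s) * H0 + s * HP

# Sweep s from 0 to 1; if the change is slow enough, the system tracks
# the instantaneous ground state all the way to the solution.
for s in (0.0, 0.5, 1.0):
    energies, states = np.linalg.eigh(H(s))
    ground = states[:, 0]  # eigenvector with the lowest energy
```

At s = 1 the ground state is |0>, the encoded solution; the physical machine reaches it by evolving slowly rather than by diagonalizing anything.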

Both types of quantum computing are susceptible to random noise that leaks into the system and derails calculations. Digital quantum computers can filter out the noise with added hardware and software, but it is significantly more difficult to dispel noise from adiabatic devices, according to Rami Barends, a member of the Google team.

The research team, which published its results in the journal Nature, thinks that adiabatic computing can be improved. The idea was to adapt the error-correction techniques of digital quantum computers to an adiabatic machine. If the quantum computer can block out interference, it can support more qubits without flooding the system with extra noise, Barends said.
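
As a loose classical analogy for the redundancy behind error correction (an illustration only; the quantum schemes the team uses are far subtler), a logical bit spread across several physical bits can survive a few flips through majority voting:

```python
def encode(bit, copies=9):
    """Repetition code: store one logical bit across several physical bits."""
    return [bit] * copies

def flip(bits, positions):
    """Simulate noise by flipping the physical bits at the given positions."""
    out = list(bits)
    for p in positions:
        out[p] ^= 1
    return out

def decode(bits):
    """Majority vote recovers the logical bit if fewer than half flipped."""
    return int(sum(bits) > len(bits) / 2)

# Two of nine physical bits get corrupted, yet the logical 1 survives.
recovered = decode(flip(encode(1), [0, 4]))
print(recovered)  # 1
```

The quantum versions must protect superpositions without directly reading the qubits, which is why scaling up error correction is the field's central engineering challenge.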

The team’s prototype seems almost gracefully simple compared to the underlying physics. Its nine qubits are made of aluminum film cooled to nearly absolute zero, the point at which the metal becomes superconducting. In that state, with no electrical resistance in the film, data can be encoded onto the qubits. The interactions between the qubits are controlled by 1,000 logic gates, which manipulate the data to run algorithms.

The scientists said that in a couple of years, the new approach could lead to devices with more than 40 qubits. Those quantum computers might solve problems that conventional computers are simply not powerful enough to handle, including large-scale simulations in chemistry that map the interactions between molecules and materials to build models of complex diseases like cancer. The approach could also be used to create stronger encryption for software.

Companies like Google and IBM believe that quantum computing will also have a profound effect on machine learning. Google is feeding artificial intelligence into all its services and has already plugged special hardware for machine learning into its data centers. Watson, IBM’s artificial intelligence, is already working in areas like medicine and digital security.

“Machine learning is all about building better models of the world to make more accurate predictions,” Hartmut Neven, a quantum computing engineer at Google, wrote in a blog post from the Quantum Artificial Intelligence Lab.
