Google’s Quantum Dream May Be Just Around The Corner

In 1981, Richard Feynman, probably the most famous physicist of his time, asked the question: “Can we simulate physics on a computer?” At the time the answer was “yes in theory, but not yet in practice.” Today, we may be on the verge of answering “yes” in practice to Feynman’s original question. Quantum computers operate in such a strange way, and are so radically different from today’s computers, that understanding them requires some grasp of quantum mechanics and of bizarre properties like “quantum entanglement.” Quantum computers are in a realm orders of magnitude beyond today’s supercomputers. Their application to specific computational problems like cryptography, Big Data analysis, computational fluid dynamics (CFD), and subatomic physics will change our world. The Canadian quantum computing company D-Wave Systems has been at the center of Google’s efforts to pioneer this technology.

Reblogged from New Scientist

Google’s Quantum Dream May Be Just Around the Corner


31 August 2016

Revealed: Google’s plan for quantum computer supremacy

The field of quantum computing is undergoing a rapid shake-up, and engineers at Google have quietly set out a plan to dominate

SOMEWHERE in California, Google is building a device that will usher in a new era for computing. It’s a quantum computer, the largest ever made, designed to prove once and for all that machines exploiting exotic physics can outperform the world’s top supercomputers.

And New Scientist has learned it could be ready sooner than anyone expected – perhaps even by the end of next year.

The quantum computing revolution has been a long time coming. In the 1980s, theorists realised that a computer based on quantum mechanics had the potential to vastly outperform ordinary, or classical, computers at certain tasks. But building one was another matter. Only recently has a quantum computer that can beat a classical one gone from a lab curiosity to something that could actually happen. Google wants to create the first.

The firm’s plans are secretive, and Google declined to comment for this article. But researchers contacted by New Scientist all believe it is on the cusp of a breakthrough, following presentations at conferences and private meetings.

“They are definitely the world leaders now, there is no doubt about it,” says Simon Devitt at the RIKEN Center for Emergent Matter Science in Japan. “It’s Google’s to lose. If Google’s not the group that does it, then something has gone wrong.”

We have had a glimpse of Google’s intentions. Last month, its engineers quietly published a paper detailing their plans (arxiv.org/abs/1608.00263). Their goal, audaciously named quantum supremacy, is to build the first quantum computer capable of performing a task no classical computer can.

“It’s a blueprint for what they’re planning to do in the next couple of years,” says Scott Aaronson at the University of Texas at Austin, who has discussed the plans with the team.

So how will they do it? Quantum computers process data as quantum bits, or qubits. Unlike classical bits, these can store a mixture of both 0 and 1 at the same time, thanks to the principle of quantum superposition. It’s this potential that gives quantum computers the edge at certain problems, like factoring large numbers. But ordinary computers are also pretty good at such tasks. Showing quantum computers are better would require thousands of qubits, which is far beyond our current technical ability.
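
To make superposition slightly more concrete, here is a minimal sketch in plain Python with NumPy (an illustration of the general idea, not Google’s software) of how a classical program represents a register of qubits: a vector of 2^n complex amplitudes, where measurement returns one bitstring with probability given by the squared magnitude of its amplitude.

```python
# Minimal sketch of superposition: an n-qubit register is a vector of 2**n
# complex amplitudes, and measuring it returns one bitstring with probability
# equal to the squared magnitude of that bitstring's amplitude.
# (Illustrative only; not Google's code.)
import numpy as np

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                 # a definite classical value, |000>

# A Hadamard gate puts the first qubit into an equal mix of 0 and 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
state = np.kron(H, np.kron(I, I)) @ state      # equal superposition of |000> and |100>

probs = np.abs(state) ** 2                     # Born rule: outcome probabilities
outcome = np.random.choice(2**n, p=probs)
print(f"measured: {outcome:0{n}b}")            # 000 or 100, each about half the time
```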

Instead, Google wants to claim the prize with just 50 qubits. That’s still an ambitious goal – publicly, they have only announced a 9-qubit computer – but one within reach.

“It’s Google’s to lose. If Google’s not the group that does it, then something has gone wrong”

To help it succeed, Google has brought the fight to quantum’s home turf. It is focusing on a problem that is fiendishly difficult for ordinary computers but that a quantum computer will do naturally: simulating the behaviour of a random arrangement of quantum circuits.

Any small variation in the input into those quantum circuits can produce a massively different output, so it’s difficult for the classical computer to cheat with approximations to simplify the problem. “They’re doing a quantum version of chaos,” says Devitt. “The output is essentially random, so you have to compute everything.”
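
The “quantum chaos” point can be illustrated with a toy simulator. The sketch below is a hypothetical illustration, not the circuit family from Google’s paper: it applies layers of random single-qubit rotations and entangling CZ gates to a small register, after which every output probability differs and none dominates, which is why an exact classical simulation cannot take shortcuts and must track the entire state vector.

```python
# Toy illustration of "quantum chaos" in random circuits: after a few layers of
# random gates, the 2**n output probabilities have no exploitable structure, so
# an exact classical simulation has to keep every amplitude.
# (Hypothetical sketch, not the circuits from the Google paper.)
import numpy as np

rng = np.random.default_rng(0)
n = 4                                     # qubits; the real experiments target ~50
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                            # start in |0000>

def embed(U, q, n):
    """Expand gate U (acting on qubits q, q+1, ...) to the full 2**n operator."""
    k = int(np.log2(U.shape[0]))          # number of qubits U acts on
    full = np.array([[1.0 + 0j]])
    i = 0
    while i < n:
        if i == q:
            full, i = np.kron(full, U), i + k
        else:
            full, i = np.kron(full, np.eye(2)), i + 1
    return full

def random_single_qubit_gate(rng):
    """Random 2x2 unitary drawn via QR decomposition of a Gaussian matrix."""
    z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

CZ = np.diag([1, 1, 1, -1]).astype(complex)    # two-qubit entangling gate

for layer in range(8):
    for q in range(n):                    # a layer of random rotations
        state = embed(random_single_qubit_gate(rng), q, n) @ state
    for q in range(n - 1):                # a layer of nearest-neighbour CZ gates
        state = embed(CZ, q, n) @ state

probs = np.abs(state) ** 2
print(np.sort(probs)[::-1])               # spread across all 16 outcomes, none dominant
```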

To push classical computing to the limit, Google turned to Edison, one of the most advanced supercomputers in the world, housed at the US National Energy Research Scientific Computing Center. Google had it simulate the behaviour of quantum circuits on increasingly large grids of qubits, up to a 6 × 7 grid of 42 qubits.

This computation is difficult because as the grid size increases, the amount of memory needed to store everything balloons rapidly. A 6 × 4 grid needed just 268 megabytes, less than is found in your average smartphone. The 6 × 7 grid demanded 70 terabytes, roughly 10,000 times that of a high-end PC.

Google stopped there because going to the next size up is currently impossible: a 48-qubit grid would require 2.252 petabytes of memory, almost double that of the top supercomputer in the world. If Google can solve the problem with a 50-qubit quantum computer, it will have beaten every other computer in existence.
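
Those memory figures follow from simple arithmetic: an n-qubit state vector holds 2^n complex amplitudes. A back-of-the-envelope calculation, assuming 16 bytes per amplitude (double-precision complex; the exact byte count depends on the precision the simulation uses), reproduces the 268-megabyte and 70-terabyte figures quoted above.

```python
# Back-of-the-envelope memory needed to store a full n-qubit state vector:
# 2**n complex amplitudes. Assumes 16 bytes per amplitude (double-precision
# complex); the published 48-qubit figure corresponds to about 8 bytes each.
def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    return 2**n_qubits * bytes_per_amplitude

def human(size):
    for unit in ("B", "KB", "MB", "GB", "TB", "PB"):
        if size < 1000:
            return f"{size:,.1f} {unit}"
        size /= 1000
    return f"{size:,.1f} EB"

for n in (24, 42, 48):                    # 6x4 grid, 6x7 grid, 48-qubit grid
    print(f"{n} qubits -> {human(state_vector_bytes(n))}")
# 24 qubits -> 268.4 MB, 42 qubits -> 70.4 TB, 48 qubits -> 4.5 PB (16 B/amplitude)
```

At 8 bytes per amplitude (single-precision complex), the 48-qubit total comes to roughly 2.25 petabytes, which matches the 2.252-petabyte figure quoted above.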

Eyes on the prize

By setting out this clear test, Google hopes to avoid the problems that have plagued previous claims of quantum computers outperforming ordinary ones – including some made by Google.

Last year, the firm announced it had solved certain problems 100 million times faster than a classical computer by using a D-Wave quantum computer, a commercially available device with a controversial history. Experts immediately dismissed the results, saying they weren’t a fair comparison.

Google purchased its D-Wave computer in 2013 to figure out whether it could be used to improve search results and artificial intelligence. The following year, the firm hired John Martinis at the University of California, Santa Barbara, to design its own superconducting qubits. “His qubits are way higher quality,” says Aaronson.

It’s Martinis and colleagues who are now attempting to achieve quantum supremacy with 50 qubits, and many believe they will get there soon. “I think this is achievable within two or three years,” says Matthias Troyer at the Swiss Federal Institute of Technology in Zurich. “They’ve shown concrete steps on how they will do it.”

Martinis and colleagues have discussed a number of timelines for reaching this milestone, says Devitt. The earliest is by the end of this year, but that is unlikely. “I’m going to be optimistic and say maybe at the end of next year,” he says. “If they get it done even within the next five years, that will be a tremendous leap forward.”

The first successful quantum supremacy experiment won’t give us computers capable of solving any problem imaginable – based on current theory, those will need to be much larger machines. But having a working, small computer could drive innovation, or augment existing computers, making it the start of a new era.

Aaronson compares it to the first self-sustaining nuclear reaction, achieved by the Manhattan Project in Chicago in 1942. “It might be a thing that causes people to say, if we want a fully scalable quantum computer, let’s talk numbers: how many billions of dollars?” he says.

Solving the challenges of building a 50-qubit device will prepare Google to construct something bigger. “It’s absolutely progress to building a fully scalable machine,” says Ian Walmsley at the University of Oxford.

For quantum computers to be truly useful in the long run, we will also need robust quantum error correction, a technique to mitigate the fragility of quantum states. Martinis and others are already working on this, but it will take longer than achieving quantum supremacy.

Still, achieving supremacy won’t be dismissed.

“Once a system hits quantum supremacy and is showing clear scale-up behaviour, it will be a flare in the sky to the private sector,” says Devitt. “It’s ready to move out of the labs.”

“The field is moving much faster than expected,” says Troyer. “It’s time to move quantum computing from science to engineering and really build devices.”

BIG IDEAS: Physics At The Crossroads

This is another in my occasional series on Big Ideas. Last night I had my first opportunity to watch Particle Fever, the acclaimed 2014 documentary on the Large Hadron Collider (LHC) and the discovery of the Higgs boson particle. I recommend it to everyone. It followed my reading this week of a much more recent New York Times op-ed describing a crisis in physics resulting from the discovery of the Higgs boson. Essentially, the science of physics has no ability, any time in the foreseeable future, to experimentally go beyond the Higgs boson. Physics is unlikely to be able to find the Holy Grail: a unifying Theory of Everything tying Einstein and the Higgs boson into one simple, elegant explanation.

A debate has erupted among physicists around the world regarding the fundamental scientific imperative to empirically verify theories through experiments like those at the LHC. With the scale and complexity of the required experiments outstripping human capability, the question is being raised: can we explicitly set aside the need for experimental confirmation of today’s most ambitious cosmic theories, so long as those theories are “sufficiently elegant and explanatory”? The emergence of this debate can clearly be seen in the Particle Fever interviews with various LHC physicists. While I do understand the quandary, my fear is that science could descend into competing belief systems, and give comfort to religious groups who believe the Earth is only 6,000 years old. That would be an even greater catastrophe. Any comments or thoughts on this?

Particle Fever, the 2014 award-winning documentary on the Large Hadron Collider and the discovery of the Higgs boson particle.

READ MORE: NY Times: Crisis At The Edge of Physics

You may think that the answer is an obvious yes, experimental confirmation being the very heart of science. But a growing controversy at the frontiers of physics and cosmology suggests that the situation is not so simple.

A few months ago in the journal Nature, two leading researchers, George Ellis and Joseph Silk, published a controversial piece called “Scientific Method: Defend the Integrity of Physics.” They criticized a newfound willingness among some scientists to explicitly set aside the need for experimental confirmation of today’s most ambitious cosmic theories — so long as those theories are “sufficiently elegant and explanatory.” Despite working at the cutting edge of knowledge, such scientists are, for Professors Ellis and Silk, “breaking with centuries of philosophical tradition of defining scientific knowledge as empirical.”

Whether or not you agree with them, the professors have identified a mounting concern in fundamental physics: Today, our most ambitious science can seem at odds with the empirical methodology that has historically given the field its credibility.

How did we get to this impasse? In a way, the landmark detection three years ago of the elusive Higgs boson particle by researchers at the Large Hadron Collider marked the end of an era. Predicted about 50 years ago, the Higgs particle is the linchpin of what physicists call the “standard model” of particle physics, a powerful mathematical theory that accounts for all the fundamental entities in the quantum world (quarks and leptons) and all the known forces acting between them (electromagnetism and the strong and weak nuclear forces), with the notable exception of gravity.

But the standard model, despite the glory of its vindication, is also a dead end. It offers no path forward to unite its vision of nature’s tiny building blocks with the other great edifice of 20th-century physics: Einstein’s cosmic-scale description of gravity. Without a unification of these two theories — a so-called theory of quantum gravity — we have no idea why our universe is made up of just these particles, forces and properties. (We also can’t know how to truly understand the Big Bang, the cosmic event that marked the beginning of time.)

This is where the specter of an evidence-independent science arises. For most of the last half-century, physicists have struggled to move beyond the standard model to reach the ultimate goal of uniting gravity and the quantum world. Many tantalizing possibilities (like the often-discussed string theory) have been explored, but so far with no concrete success in terms of experimental validation.

Today, the favored theory for the next step beyond the standard model is called supersymmetry (which is also the basis for string theory). Supersymmetry predicts the existence of a “partner” particle for every particle that we currently know. It doubles the number of elementary particles of matter in nature. The theory is elegant mathematically, and the particles whose existence it predicts might also explain the universe’s unaccounted-for “dark matter.” As a result, many researchers were confident that supersymmetry would be experimentally validated soon after the Large Hadron Collider became operational.

So far, however, the collider has found no supersymmetric particles, and some researchers have begun to abandon the theory. But many won’t. Some may choose instead to simply retune their models to predict supersymmetric particles at masses beyond the reach of the Large Hadron Collider’s power of detection, and beyond that of any foreseeable substitute.

Implicit in such a maneuver is a philosophical question: How are we to determine whether a theory is true if it cannot be validated experimentally? Should we abandon it just because, at a given level of technological capacity, empirical support might be impossible? If not, how long should we wait for such experimental machinery before moving on: Ten years? Fifty years? Centuries?

Consider, likewise, the cutting-edge theory in physics that suggests that our universe is just one universe in a profusion of separate universes that make up the so-called multiverse. This theory could help solve some deep scientific conundrums about our own universe (such as the so-called fine-tuning problem), but at considerable cost: Namely, the additional universes of the multiverse would lie beyond our powers of observation and could never be directly investigated. Multiverse advocates argue nonetheless that we should keep exploring the idea — and search for indirect evidence of other universes.

The opposing camp, in response, has its own questions. If a theory successfully explains what we can detect but does so by positing entities that we can’t detect (like other universes or the hyperdimensional superstrings of string theory), then what is the status of these posited entities? Should we consider them as real as the verified particles of the standard model? How are scientific claims about them any different from any other untestable but useful explanations of reality?

Recall the epicycles, the imaginary circles that Ptolemy used and formalized around A.D. 150 to describe the motions of planets. Although Ptolemy had no evidence for their existence, epicycles successfully explained what the ancients could see in the night sky, so they were accepted as real. But they were eventually shown to be a fiction, more than 1,500 years later. Are superstrings and the multiverse, painstakingly theorized by hundreds of brilliant scientists, anything more than modern-day epicycles?

Just a few days ago, scientists restarted investigations with the Large Hadron Collider, after a two-year hiatus. Upgrades have made it even more powerful, and physicists are eager to explore the properties of the Higgs particle in greater detail. If the upgraded collider does discover supersymmetric particles, it will be an astonishing triumph of modern physics. But if nothing is found, our next steps may prove to be difficult and controversial, challenging not just how we do science but what it means to do science at all.

Quantum tech is more than just crazy science: It’s good business, from mobile payments to fighting the NSA

Management students may ask why the title of this post claims that quantum technology is good business. So let me try to explain, and then read on to the PandoDaily post by David Holmes. The bottom line is that some basic understanding of quantum mechanics is going to be a valuable management skill going forward. Why? Read on.

Yesterday, National Public Radio in the United States (which can be heard online) broadcast a fascinating discussion about Monday’s announcement of the long-awaited breakthrough: evidence of primordial gravitational waves, which carry the fingerprint of the original Big Bang. Featuring legendary astrophysicist Leonard Susskind of Stanford and a number of other leading physicists, the discussion inevitably drifted to quantum mechanics and the Big Bang itself, which Stanford physics professor Chao-Lin Kuo described as “mind scrambling.” Quantum entanglement is another area that defies common sense: entangled particles remain correlated no matter how far apart they are, as if they could influence each other faster than the speed of light, which should be impossible. Einstein’s famous quote, “God does not play dice,” was his reaction to the non-deterministic nature of quantum events and theory, which also sits uneasily alongside his general theory of relativity. It turns out the random nature of quantum mechanics provides a superior approach to hideously complex problems, finding the best “probabilistic” solutions. Quantum mechanics is also providing a potential way forward in encryption and privacy.

Read and listen on NPR: Scientists Announce Big Bang Breakthrough

However, all of this “mind scrambling” pure science is rapidly becoming applied science: science turned into useful technological innovation and applied to economic activity. Some of my students may recall our discussions of Moore’s Law in semiconductor design. As Moore’s Law reaches its finite limit, quantum “technology” is creating one path forward, and providing new solutions to Internet security and supercomputing. David Holmes’s PandoDaily article today attempts to explain in greater detail why this is important for business.

Vern Brownell, CEO of D-Wave Systems, has written an excellent explanation, in layman’s terms, of the importance of quantum computing and how it differs from “deterministic” computing.

Read more: Solving the unsolvable: a quantum boost for supercomputing

Best of all, there is an excellent book for those willing to devote the time and grey matter to quantum physics: “Quantum Physics: A Beginner’s Guide,” by Alastair Rae, available from Amazon in paperback or as a Kindle e-book.
