New Math Makes Scientists More Certain About Quantum Uncertainties

[Image: Formulas and sketches related to quantum physics written on a blackboard. Illustration: iStockphoto]

By: Mark Anderson

Quantum measurements, at the core of next-generation technologies including quantum computing, quantum cryptography, and ultra-sensitive electronics, may face a new hurdle as system sensitivities brush up against Heisenberg’s Uncertainty Principle.

The practical Heisenberg limit on how precisely some quantities can be measured may be larger than expected, by a factor of pi. This new finding would, according to physicist Wojciech Górecki of the University of Warsaw in Poland, represent “an impediment compared to previous expectations.”

Górecki said he and his collaborators arrived at this theoretical limit by applying a branch of math known as Bayesian statistics to familiar quantum measurement problems.

The standard problem posed in many Intro to Quantum classes involves the push-and-pull conflict between measuring a particle’s position with high precision versus knowing that same particle’s momentum with high precision as well.

As Werner Heisenberg famously theorized in 1927, the product of uncertainties of these two observables can never dip below a very small number equal to Planck’s constant divided by four times pi (h/4π).
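In symbols, writing Δx for the uncertainty in position and Δp for the uncertainty in momentum, the relation can be stated as:

```latex
\Delta x \, \Delta p \;\geq\; \frac{h}{4\pi}
```

where h is Planck’s constant.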

So, down at the quantum scale, there are always tradeoffs. Measuring a particle’s position with very high precision calls for sacrificing how precisely you can determine the speed and direction of its travel.

Yet, said Górecki, plenty of quantum scale measurements involve neither position nor momentum. For instance, some photonics instruments measure quantities like the phase of a wavefront versus the number of photons counted in a given energy range.

Górecki notes that the canonical Heisenberg principle isn’t as much help here as a related concept called the “Heisenberg limit.” The Heisenberg limit, he says, delineates the smallest possible uncertainty in a measurement, given a set number of times a system is probed. “It is a natural consequence of Heisenberg’s uncertainty principle, interpreted in a slightly broader context,” says Górecki.

It was long believed that, for a hypothetical technology trying to measure phase as precisely as possible using only n photons, the Heisenberg limit on the uncertainty in phase was 1/n. But no technology had been devised to prove that 1/n was the ultimate, universal Heisenberg limit.

There’s a good reason why. Górecki and colleagues report in a new paper in the journal Physical Review Letters that the Heisenberg limit in this case scales as π/n instead of 1/n. In other words, the smallest measurable uncertainty is more than three times as large as previously believed. And so now we know that our observations of the universe are a little bit fuzzier than we imagined.

(To be clear, “n” here is not necessarily the number of photons used in a measurement. It could also stand for other limits on the resources expended in making a precision observation: the number of quantum gates in a measurement, Górecki notes, or the total time spent interrogating the system.)
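The difference between the two scalings is easy to see numerically. The short sketch below (a toy illustration, not a quantum simulation; the function names are my own) compares the previously assumed 1/n bound with the π/n bound reported by Górecki and colleagues as the resource count n grows:

```python
import math

def conventional_limit(n):
    """Previously assumed Heisenberg limit on phase uncertainty: 1/n."""
    return 1.0 / n

def revised_limit(n):
    """Limit reported by Gorecki and colleagues: pi/n."""
    return math.pi / n

# Both bounds tighten as resources grow, but the revised bound is
# always a factor of pi (about 3.14) looser than the old one.
for n in (10, 100, 1000):
    old, new = conventional_limit(n), revised_limit(n)
    print(f"n={n:5d}  1/n={old:.5f}  pi/n={new:.5f}  ratio={new / old:.3f}")
```

The ratio between the two bounds is π at every n, which is why the paper’s result amounts to a constant-factor, rather than a change in scaling, correction.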

Górecki says the new finding may not remain purely theoretical for too much longer. A 2007 experiment in precision phase measurement came within 56 percent of the new Heisenberg limit.

“Our paper has attracted the interest of eminent researchers in the field of statistics, who find this idea worth spreading,” says Górecki. “Perhaps it would be possible to construct a simpler proof that could be included in standard textbooks.”

This article originally appeared in IEEE Spectrum on 11 February 2020.