How fast is the universe expanding? New results deepen the controversy

A large cluster galaxy in the center acts as a gravitational lens, splitting the light from a more distant supernova into four yellow images (arrows)

The standard model of particle physics has reigned supreme since the 1970s, successfully describing physical reality in a vast array of experimental tests. Among other things, the standard model predicted the existence of a particle, now known as the Higgs boson, underlying the phenomenon of mass. This particle was experimentally discovered in 2012, nearly 50 years after it was first predicted.

Yet physicists have known for many years that the standard model cannot be the final answer. Most notably, quantum theory on one hand and general relativity on the other are known to be mathematically incompatible. This has led to research in string theory and loop quantum gravity as potential frameworks to resolve this incompatibility. Other difficulties may exist as well.

So how can physics advance beyond the standard model? There is only so far that mathematical theories can be taken in the absence of solid experimental results. As Sabine Hossenfelder has emphasized, beautiful mathematics published in a vacuum of experimental data can actually lead physics astray.

Three nagging anomalies

In a previous Math Scholar article, we described several anomalies that have arisen in recent physics experiments, any of which may potentially be a spark that leads to new physics beyond the standard model. Here are three:

  • The proton radius anomaly: This stems from the fact that careful measurements of a proton’s radius when orbited by an electron yield approximately 0.877 femtometers (i.e., 0.877 x 10^-15 meters), whereas separate measurements of the proton’s radius when it is orbited by a muon (“muonic hydrogen”) yield 0.84 femtometers. These measurements differ by significantly more than the error bars of the two sets of experiments. See this Quanta article for details.

  • The neutron lifetime anomaly: This stems from the fact that “bottle” measurements of a neutron’s average lifetime yield 879.3 seconds, whereas “beam” measurements yield 888 seconds. The error bar of the bottle measurements is just 0.75 seconds, and that of the beam measurements is just 2.1 seconds, so again the two measurements appear to be further apart than can reasonably be explained as statistical error. See this Quanta article for additional details.

  • The Hubble constant anomaly: The Hubble constant H0 is a measure of the rate of expansion of the universe. One method to determine H0 is based on the flat Lambda cold dark matter (Lambda-CDM) model of the universe, combined with careful measurements of the cosmic microwave background (CMB) data from the Planck satellite. The latest (2018) result from the Planck team yielded H0 = 67.4, plus or minus 0.5 (the units are kilometers per second per megaparsec). Another approach is to employ a more traditional astronomical technique, based on observations of Cepheid variable stars, combined with parallax measurements as a calibration. In 2016, a team of astronomers using the Wide Field Camera 3 (WFC3) of the Hubble Space Telescope obtained the value H0 = 73.24, plus or minus 1.74. Again, these two values differ by significantly more than the combined error bars of the two measurements.
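To make the units in that last bullet concrete: the Hubble law states that a galaxy's recession velocity is proportional to its distance, v = H0 x d. Here is a minimal sketch in Python (the 100-megaparsec distance is an arbitrary illustrative choice, not a figure from the measurements above):

```python
# Hubble law: recession velocity v = H0 * d.
# With H0 in km/s/Mpc and distance in Mpc, v comes out in km/s.
def recession_velocity_km_s(h0_km_s_mpc: float, distance_mpc: float) -> float:
    return h0_km_s_mpc * distance_mpc

# The two headline H0 values, at an illustrative distance of 100 megaparsecs:
v_planck = recession_velocity_km_s(67.4, 100.0)    # about 6740 km/s
v_cepheid = recession_velocity_km_s(73.24, 100.0)  # about 7324 km/s
```

The two candidate values of H0 thus disagree by roughly 9 percent, which is large enough to matter for any distance inferred from a redshift.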

For each of these anomalies, experimental teams on both sides have been attempting to reduce error bars and to explore the fundamental theory to see if there is any heretofore ignored possibility of error.

New results for the Hubble constant anomaly

In the past few months (as of August 2019), several new experimental studies have been published on the Hubble constant. In March 2019, a research team working with the Hubble Space Telescope reported that, based on observations of 70 long-period Cepheid variable stars in the Large Magellanic Cloud, they were able to refine their estimate to H0 = 74.03, plus or minus 1.42. Needless to say, this new result does not help to resolve the discrepancy; if anything, it widens it.

In July 2019, a group reported results from another experimental approach, known as the “Tip of the Red Giant Branch” (TRGB). Their approach, which is analogous to but independent from the approach taken with Cepheid variable stars, is to analyze a surge in helium burning near the end of a red giant star’s lifetime. Using this scheme, they reported H0 = 69.8, plus or minus 1.7. This is slightly more than the Planck team value (67.4), but not nearly enough to close the gap with the Cepheid approach.

A third group also announced results in July 2019. This project, called H0 Lenses in COSMOGRAIL’s Wellspring (HoLiCOW) [yes, that is the acronym], employs gravitational lensing, namely the phenomenon predicted by general relativity that light bends as it passes near an intervening star or galaxy (see graphic above). The specific approach of the HoLiCOW project is to measure light from a very distant quasar, which is lensed by a closer galaxy. When this happens, multiple time-delayed images of the quasar appear around the edges of the intervening galaxy, as viewed by Earth-bound astronomers. The HoLiCOW project’s latest result is H0 = 73.3, plus or minus 1.76.

Are the physical models wrong?

Researchers are understandably perplexed by the latest reports: the Planck team (based on the Lambda-CDM model) reports H0 = 67.4 (plus or minus 0.5); the TRGB team reports 69.8 (plus or minus 1.7); the HoLiCOW team reports 73.3 (plus or minus 1.76); and the Cepheid team reports 74.03 (plus or minus 1.42). Obviously these results cannot all simultaneously be correct. For example, the Cepheid team’s figure (74.03) differs from the Planck figure (67.4) by roughly 4.4 sigma, and the HoLiCOW team reports that combining their measurement with other late-universe measurements raises the tension with Planck to 5.3 sigma. While each of these teams is hard at work scrutinizing their methods and refining their results, there is an unsettling possibility that one or more of the underlying physical theories are just plain wrong, at least on the length and time scales involved.
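One standard way to quantify such a discrepancy is to divide the difference between the two central values by their combined error bar, adding the individual error bars in quadrature under the assumption of independent Gaussian errors. A minimal sketch of that calculation (the function name is ours, for illustration):

```python
import math

def tension_sigma(value_a: float, err_a: float,
                  value_b: float, err_b: float) -> float:
    """Discrepancy between two measurements, in standard deviations,
    assuming independent Gaussian error bars combined in quadrature."""
    return abs(value_a - value_b) / math.sqrt(err_a ** 2 + err_b ** 2)

# Cepheid (74.03 +/- 1.42) versus Planck (67.4 +/- 0.5):
print(round(tension_sigma(74.03, 1.42, 67.4, 0.5), 1))  # prints 4.4
```

Note that pairwise tensions computed this way are smaller than the oft-quoted 5.3 sigma figure, which comes from pooling several independent late-universe measurements against Planck.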

Key among these theories is the Lambda-CDM model of big bang cosmology. Yet physicists and cosmologists are loath to discard this model, because it explains so much so well:

  • The cosmic microwave background radiation and its properties.
  • The large-scale structure and distribution of galaxies.
  • The present-day observed abundances of the light elements (hydrogen, deuterium, helium and lithium).
  • The accelerating expansion of the universe, as observed in measurements of distant galaxies and supernovas.

As Lloyd Knox, a cosmologist at the University of California, Davis, explains,

The Lambda-CDM model has been amazingly successful. … If there’s a major overhaul of the model, it’s hard to see how it wouldn’t look like a conspiracy. Somehow this ‘wrong’ model got it all right.

Various modifications to the Lambda-CDM model have been proposed, but while some of these changes partially alleviate the Hubble constant discrepancy, others make it worse. None are taken very seriously in the community at the present time.

For additional details and discussion, see this Scientific American article (although note that the Scientific American article’s report on the HoLiCOW measurement does not agree with the HoLiCOW team’s latest technical paper).


In spite of the temptation to jump to conclusions and throw out the standard model or big bang cosmology, considerable caution is in order. After all, most such anomalies are eventually resolved, usually as some defect of the experimental process or as a faulty application of the theory.

A good example of an experimental defect is the 2011 announcement by Italian scientists that neutrinos emitted at CERN (near Geneva, Switzerland) had arrived at the Gran Sasso Lab (in Italy’s Apennine Mountains) 60 nanoseconds sooner than if they had traveled at the speed of light. If upheld, this finding would have constituted a violation of Einstein’s theory of relativity. As it turns out, the experimental team subsequently discovered that the discrepancy was due to a loose fiber optic cable that had introduced an error in the timing system.

A good example of misapplication of underlying theory is the solar neutrino anomaly, namely a discrepancy between the number of neutrinos observed emanating from the interior of the sun and the number that had been predicted (incorrectly, as it turned out). In 1998, researchers discovered that the anomaly could be resolved if neutrinos have a very small but nonzero mass, in which case the flavor of a neutrino can change en route from the sun to the earth. Takaaki Kajita and Arthur McDonald received the 2015 Nobel Prize in physics for this discovery.
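The flavor-change mechanism can be illustrated with the standard two-flavor oscillation formula, P = sin^2(2 theta) x sin^2(1.27 dm^2 L / E), where dm^2 is the mass-squared splitting in eV^2, L the distance in km, and E the energy in GeV. A minimal sketch (the numerical inputs below are illustrative choices, not values from the article):

```python
import math

def oscillation_probability(theta: float, delta_m2_ev2: float,
                            length_km: float, energy_gev: float) -> float:
    """Two-flavor neutrino oscillation probability:
    P(nu_a -> nu_b) = sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E),
    with dm^2 in eV^2, L in km, E in GeV."""
    return (math.sin(2.0 * theta) ** 2
            * math.sin(1.27 * delta_m2_ev2 * length_km / energy_gev) ** 2)

# With zero mixing angle, the flavor never changes:
p_zero = oscillation_probability(0.0, 2.5e-3, 295.0, 0.6)  # 0.0
```

The key point is that the probability depends on the mass splitting dm^2: if neutrinos were exactly massless, the oscillation term would vanish and the observed flavor deficit could not occur.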

In any event, sooner or later some experimental result may be found that fundamentally upsets currently accepted theories, whether a specific framework such as Lambda-CDM big bang cosmology or even the foundational standard model itself. Are any of the above-mentioned anomalies of this earth-shaking character? Only time will tell.
