New galaxy map challenges cosmological models

Credit: Lawrence Berkeley National Lab



Updated 2 April 2025

The Hubble tension

One experimental anomaly that has bedeviled physicists and cosmologists for years is the discrepancy in values of the Hubble constant based on different experimental approaches. This discrepancy is now known as the Hubble tension.

The Hubble constant $H_0$ is a measure of the rate of expansion of the universe, and is directly connected to estimates of the age $A$ of the universe via the relation $A = 1 / H_0$. Units must be converted here, since the age of the universe is normally cited in billions of years, whereas the Hubble constant is usually given in kilometers per second per megaparsec (a parsec is $3.0857 \times 10^{13}$ km, or roughly 3.26 light-years). Also, an adjustment factor is normally applied to this formula to be in full conformance with the Big Bang model.
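
To make the unit conversion concrete, here is a minimal sketch in Python (the values of $H_0$, $\Omega_m$ and $\Omega_\Lambda$ below are illustrative assumptions, not results from the studies discussed later) showing how a Hubble constant of 70 kilometers per second per megaparsec translates into an age estimate in billions of years, first via the naive formula $A = 1/H_0$ and then with the standard flat Lambda-CDM adjustment factor applied:

    import math

    H0 = 70.0                    # illustrative Hubble constant, km/s/Mpc
    KM_PER_MPC = 3.0857e19       # one megaparsec in kilometers
    SEC_PER_GYR = 3.1557e16      # one billion (Julian) years in seconds

    # Naive age estimate A = 1/H0, converted to billions of years
    hubble_time_gyr = KM_PER_MPC / H0 / SEC_PER_GYR
    print(f"1/H0 = {hubble_time_gyr:.2f} Gyr")                    # roughly 14.0 Gyr

    # Standard flat Lambda-CDM adjustment, assuming Omega_m = 0.3, Omega_Lambda = 0.7
    om, ol = 0.3, 0.7
    factor = (2.0 / (3.0 * math.sqrt(ol))) * math.asinh(math.sqrt(ol / om))
    print(f"adjusted age = {factor * hubble_time_gyr:.2f} Gyr")   # roughly 13.5 Gyr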

The trouble is that the best current experimental results for $H_0$ give conflicting values. One approach is to employ traditional astronomical techniques, typically based on observations of supernovas, Cepheid variable stars or other phenomena, combined with redshift measurements to determine the rate of recession. Another approach is based on the Lambda cold dark matter (Lambda-CDM) model of Big Bang cosmology, combined with careful measurements of cosmic microwave background (CMB) data.

New results for the Hubble constant

Numerous large international research teams have recently launched studies in an attempt to resolve the Hubble tension. But rather than settling the issue, their latest results only deepen the controversy. This chart shows a selection of recent results (see this previous Math Scholar article for full data and references).

As can easily be seen from the chart, the first 17 measurements, which are mostly based on astronomical measurements of distance and redshift of supernovas, are noticeably distinct from the last six measurements, which are mostly based on analyses of the cosmic microwave background (CMB). In particular, the distance/redshift studies give an average value of 73.0, with a standard error of 1.0, while the CMB-based studies give an average value of 67.5, with a standard error of 0.5. Needless to say, these two values are incompatible: each lies far outside the error bars of the other. If we focus on the study with the smallest error bar in each group, namely the Type Ia supernova study (column 2) and the Planck data study (column 23), the discrepancy is even starker: the two values differ by 6.5 standard deviations based on the Type Ia study's standard error, and by 11.7 standard deviations based on the Planck study's standard error.
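
To illustrate where such "number of standard deviations" figures come from, here is a minimal Python sketch that uses the two group averages quoted above as inputs (the individual Type Ia and Planck values behind the 6.5- and 11.7-sigma figures are not reproduced here):

    import math

    h0_local, se_local = 73.0, 1.0   # distance/redshift group average and standard error
    h0_cmb, se_cmb = 67.5, 0.5       # CMB-based group average and standard error

    diff = h0_local - h0_cmb
    print(diff / se_local)                      # 5.5 sigma, measured against the local error bar
    print(diff / se_cmb)                        # 11.0 sigma, measured against the CMB error bar
    print(diff / math.hypot(se_local, se_cmb))  # about 4.9 sigma, combining both errors in quadrature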

Additional recent data and background are given in this November 2023 arXiv preprint, this November 2024 Scientific American article and this Wikipedia article. The chart above is analogous to a chart in the Scientific American article, but was independently constructed by the present author, based on data from the original papers.

Are the physical models wrong?

While each of these teams is hard at work scrutinizing their methods and refining their results, researchers are increasingly considering the unsettling possibility that one or more of the underlying physical theories are just plain wrong, at least on the length and time scales involved. Key among these theories is the Lambda-CDM model of Big Bang cosmology. Yet physicists and cosmologists are loath to discard this model, because it explains so much so well:

  • The cosmic microwave background radiation and its properties.
  • The large-scale structure and distribution of galaxies.
  • The present-day observed abundances of the light elements (hydrogen, deuterium, helium and lithium).
  • The accelerating expansion of the universe, as observed in measurements of distant galaxies and supernovas.

As Lloyd Knox, a cosmologist at the University of California, Davis, explains,

The Lambda-CDM model has been amazingly successful. … If there’s a major overhaul of the model, it’s hard to see how it wouldn’t look like a conspiracy. Somehow this ‘wrong’ model got it all right.

Various modifications to the Lambda-CDM model have been proposed, but while some of these changes partially alleviate the Hubble tension, others make it worse. None is taken very seriously in the community at the present time.

New galaxy data: Is the universe’s expansion slowing down?

One key assumption of the Lambda-CDM model is that dark energy is truly constant: the lambda (cosmological constant) term in Einstein's equations of general relativity does not change with time, and it drives the accelerating expansion of the universe. How solid is this assumption?
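
The assumption can be stated in terms of the standard acceleration (second Friedmann) equation, in which a truly constant $\Lambda$ contributes a term that never changes:

$$\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right) + \frac{\Lambda c^2}{3},$$

where $a$ is the cosmic scale factor, and $\rho$ and $p$ are the density and pressure of matter and radiation. As matter dilutes with expansion, the constant $\Lambda$ term eventually dominates and drives an ever-accelerating expansion.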

A new release of data from the Dark Energy Spectroscopic Instrument (DESI), an international collaboration of more than 900 researchers, appears to suggest that this accelerating expansion might not be constant after all, confounding at least 25 years of research in cosmology. In particular, DESI has mapped roughly four million stars, 13.1 million galaxies and 1.6 million quasars. The dataset, some 270 Tbyte in size, contains ten times as much data and covers seven times the area of the sky as a previous release. These objects range from nearby stars in the Milky Way to galaxies billions of light-years away. The instrument separates the light from each object into a spectrum, which provides a key to its distance, thus permitting researchers to construct a detailed history of cosmic growth. See this LBNL press release for additional details.
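
As an illustration of that last step (this is only a sketch under an assumed fiducial cosmology, using the astropy package; it is not DESI's actual analysis pipeline), the redshift measured from an object's spectrum can be converted into a distance and a lookback time once a cosmological model is specified:

    # Sketch only: assumes the astropy package and fiducial flat Lambda-CDM parameters
    from astropy.cosmology import FlatLambdaCDM

    cosmo = FlatLambdaCDM(H0=70, Om0=0.3)    # illustrative parameters, not DESI's fit

    for z in (0.1, 0.5, 1.0, 2.0):
        d = cosmo.comoving_distance(z)       # distance today, in megaparsecs
        t = cosmo.lookback_time(z)           # how far back in time we are seeing, in Gyr
        print(f"z = {z}: distance ~ {d.value:.0f} Mpc, lookback time ~ {t.value:.1f} Gyr")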

These new results strengthen the conclusion drawn from an earlier, more limited data release last year, namely that the data are inconsistent with the standard Lambda-CDM model. The statistical confidence level of the latest set of results is 4.2 sigma; in other words, if the Lambda-CDM model were correct, there would be only roughly one chance in 30,000 of obtaining data this discrepant. In particular, dark energy appears to have weakened over the past five billion years or so.
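
For reference, here is how a 4.2-sigma result translates into a probability of this order, assuming a Gaussian distribution and a two-tailed convention (the precise figure depends on the convention used):

    from scipy.stats import norm

    sigma = 4.2
    p = 2 * norm.sf(sigma)     # two-tailed tail probability; norm.sf is 1 - CDF
    print(p)                   # about 2.7e-5
    print(1 / p)               # about one chance in 37,000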

Early dark energy?

Needless to say, the conclusion that dark energy is weakening is deeply perplexing. As mentioned above, the standard explanation of dark energy over the past 25 years is that it is a manifestation of the cosmological constant of general relativity. But if dark energy is not really constant, then the very identification of dark energy with the cosmological constant may be in error.

One way or another, non-constant dark energy demands an explanation, and theoreticians are already considering what that explanation might be. For example, Adam Riess (co-recipient of the 2011 Nobel Prize in physics for the discovery of the accelerating universe) and Marc Kamionkowski have proposed that some unknown component of matter-energy, called early dark energy, with a density roughly 10 percent of the conventional value, existed during the early universe but later decayed away. This would bring theory into closer agreement with the data.

As Kamionkowski explains,

The most obvious form for early dark energy to take is a field, similar to an electromagnetic field, that fills space. This field would have added a negative-pressure energy density to space when the universe was young, with the effect of pushing against gravity and propelling space toward a faster expansion. There are two types of fields that could fit the bill. The simplest option is what’s called a slowly rolling scalar field. This field would start off with its energy density in the form of potential energy — picture it resting on top of a hill. Over time the field would roll down the hill, and its potential energy would be converted to kinetic energy. Kinetic energy wouldn’t affect the universe’s expansion the way the potential energy did, so its effects wouldn’t be observable as time went on.
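
In the standard treatment of such a slowly rolling scalar field $\phi$ (a textbook sketch, not taken from the early-dark-energy papers themselves), the field obeys

$$\ddot{\phi} + 3H\dot{\phi} + \frac{dV}{d\phi} = 0, \qquad w = \frac{p}{\rho} = \frac{\tfrac{1}{2}\dot{\phi}^2 - V(\phi)}{\tfrac{1}{2}\dot{\phi}^2 + V(\phi)},$$

where $H$ is the Hubble parameter and $V(\phi)$ is the field's potential. While the field sits near the top of its potential, $\dot{\phi} \approx 0$ and $w \approx -1$, so it behaves like a temporary cosmological constant and pushes the expansion to accelerate. Once it rolls down and the kinetic term dominates, $w \to +1$ and its energy density dilutes rapidly (as $a^{-6}$), so its effects soon become unobservable, just as Kamionkowski describes.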

Weakening dark energy and the Hubble tension

There is just one problem with the Riess-Kamionkowski theory and most other attempted explanations of weakening dark energy: they exacerbate the Hubble tension! As Riess himself now acknowledges, “With the DESI results, I imagine many folks will be looking for an idea that can explain both late-time evolution in dark energy and the Hubble tension.”

For additional details and discussion, see this Quanta article, this Nature article, this March 2025 Scientific American article, this April 2025 Scientific American article and this New York Times article.

Caution

In spite of the temptation to throw out or substantially revise the currently accepted standard model or the Lambda-CDM model of Big Bang cosmology, considerable caution is in order. After all, in most cases anomalies such as this are eventually resolved, usually as the result of some defect in the experimental process or a faulty application of the theory.

A good example of an experimental defect is the 2011 announcement by Italian scientists that neutrinos emitted at CERN (near Geneva, Switzerland) had arrived at the Gran Sasso Lab (in the Apennine Mountains of central Italy) 60 nanoseconds sooner than if they had traveled at the speed of light. If upheld, this finding would have constituted a violation of Einstein’s theory of relativity. As it turned out, the experimental team subsequently discovered that the discrepancy was due to a loose fiber optic cable that had introduced an error in the timing system.

A good example of misapplication of the underlying theory is the solar neutrino anomaly, namely a discrepancy between the number of neutrinos observed emanating from the interior of the sun and the number that had been predicted (incorrectly, as it turned out) on the basis of the standard model. In 1998, researchers discovered that the anomaly could be resolved if these neutrinos have a very small but nonzero mass; in that case the flavor of some of these neutrinos can change en route from the sun to the earth, resolving the discrepancy. Takaaki Kajita and Arthur McDonald received the 2015 Nobel Prize in physics for this discovery.

In any event, sooner or later some experimental result may be found that fundamentally and irrevocably upends some currently accepted theory, either a specific framework such as Lambda-CDM Big Bang cosmology, or even the foundational standard model of physics. Will the Hubble tension or the discovery of non-constant dark energy ultimately be the straw that breaks the camel’s back? Only time will tell.
