Jan 20 2025

The Hubble Tension Hubbub

Published under Astronomy

There really is a significant mystery in the world of cosmology. This, in my opinion, is a good thing. Such mysteries point in the direction of new physics, or at least a new understanding of the universe. Resolving this mystery – called the Hubble Tension – is a major goal of cosmology. This is a scientific cliffhanger, one which will unfortunately take years or even decades to sort out. Recent studies have now made the Hubble Tension even more dramatic.

The Hubble Tension refers to discrepancies in measuring the rate of expansion of the universe using different models or techniques. We have known since 1929 that the universe is not static but expanding. This was the famous discovery of Edwin Hubble, who noticed that galaxies farther from Earth have a greater redshift, meaning they are moving away from us faster. This can only be explained by an expanding universe – everything (not gravitationally bound) is moving away from everything else. This became known as Hubble’s Law, and the rate of expansion as the Hubble Constant.
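Hubble’s Law is just a linear relationship: recession velocity equals the Hubble Constant times distance. A minimal sketch, using a round illustrative value of 70 km/s/Mpc (chosen only for simplicity, not one of the measured figures):

```python
# Hubble's Law: v = H0 * d
H0 = 70.0  # km/s per megaparsec -- round number for illustration only

def recession_velocity(distance_mpc: float) -> float:
    """Recession velocity in km/s for a galaxy at the given distance in Mpc."""
    return H0 * distance_mpc

for d in (10, 100, 1000):
    print(f"{d:5d} Mpc -> {recession_velocity(d):9.0f} km/s")
```

The farther the galaxy, the faster it recedes – which is exactly the pattern Hubble saw in the redshift data.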

Then in 1998 two teams, the Supernova Cosmology Project and the High-Z Supernova Search Team, analyzing data from Type Ia supernovae, found that the expansion rate of the universe is actually accelerating – it is faster now than in the distant past. This discovery won the Nobel Prize in Physics in 2011 for Adam Riess, Saul Perlmutter, and Brian Schmidt. The problem remains, however, that we have no idea what is causing this acceleration, or even a theory about what might have the necessary properties to cause it. This mysterious force was dubbed “dark energy”, and it instantly became the dominant form of mass-energy in the universe, making up roughly 68-70% of the total.

I have seen the Hubble Tension framed in two ways – as a disconnect between what our models of cosmology predict and measurements of the rate of expansion, or as a disagreement between different methods of measuring that expansion rate. The two main methods of measuring the expansion rate use Type Ia supernovae and the cosmic microwave background radiation. Type Ia supernovae are considered standard candles because they have roughly the same absolute magnitude (intrinsic brightness). They are white dwarf stars in a binary system, siphoning off mass from their partner. When they reach a critical mass, they go supernova. So every Type Ia goes supernova at about the same mass, and therefore with about the same brightness. If we know an object’s absolute magnitude, we can calculate its distance from how bright it appears. It was this data that led to the discovery that the expansion of the universe is accelerating.

But using our models of physics, we can also calculate the expansion of the universe by looking at the cosmic microwave background (CMB) radiation, the glow left over from the Big Bang. This radiation cools as the universe expands, so combined with our cosmological models it lets us infer the expansion rate. Here is where the Hubble Tension comes in. Using Type Ia supernovae, we calculate the Hubble Constant to be about 73 km/s per megaparsec. Using the CMB, the calculation gives about 67 km/s/Mpc. The gap between these numbers is larger than the uncertainties in either measurement – they are genuinely different.
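How big is the disagreement? Taking the two figures quoted above at face value:

```python
h0_sn = 73.0   # km/s/Mpc, from Type Ia supernovae (figure quoted in the text)
h0_cmb = 67.0  # km/s/Mpc, from the CMB (figure quoted in the text)

gap = h0_sn - h0_cmb
print(f"absolute gap: {gap:.0f} km/s/Mpc")  # 6 km/s/Mpc
print(f"relative gap: {gap / h0_cmb:.1%}")  # about 9% relative to the CMB value
```

A roughly 9% mismatch is far too large to wave away when each method claims percent-level precision.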

At first it was thought that perhaps the difference is due to imprecision in our measurements. As we gather more and better data (such as building a more complete sample of Type 1a supernovae), using newer and better instruments, some hoped that perhaps these two numbers would come into alignment. The opposite has happened – newer data has solidified the Hubble Tension.

A recent study, for example, uses the Dark Energy Spectroscopic Instrument (DESI) to make more precise measurements of Type Ia supernovae in the nearby Coma cluster. This allows a more precise calibration of our overall measurements of distance in the universe. With this more precise data, the authors argue that the Hubble Tension should now be considered a “Hubble Crisis” (a term which then metastasized throughout reporting headlines). The bottom line is that there really is a disconnect between theory and measurement.

Even more interesting, another group has used updated Type Ia supernova data to argue that perhaps dark energy does not have to exist at all. This is their argument: the calculation of the Hubble Constant used to establish an accelerating universe is based on the assumption of isotropy and homogeneity at the scale we are observing. Isotropy means that the universe looks essentially the same in every direction, while homogeneity means that every piece of the universe is the same as every other piece. So no matter where you are and which direction you look in, you will observe about the same density of mass and energy. This is obviously not true at small scales, like within a galaxy, so the real question is – at what scale does the universe become isotropic and homogeneous? Essentially, cosmologists have used the assumption of isotropy and homogeneity at the scale of the observable universe to make their calculations regarding expansion. This is the lambda CDM model (ΛCDM), where lambda is the cosmological constant and CDM is cold dark matter.

This group, however, argues that this assumption does not hold. There are vast voids with little matter, and matter tends to clump along filaments. If you instead take into account these variations in the density of matter throughout the universe, you get different results for the Hubble Constant. The primary reason for this is General Relativity. Einstein’s (highly verified) theory tells us that matter affects spacetime: where matter is dense, time runs relatively slower. This means that as we look out into the universe, the light we see is effectively travelling faster through empty space than through space with lots of matter, because that matter is causing time to slow down. So if you measure the expansion rate of the universe, it will appear faster in the voids and slower in galaxy clusters. As the universe expands, the voids grow, meaning the later universe has more void volume and therefore measures a faster expansion, while the earlier universe has smaller voids and therefore measures a slower expansion. They call this the timescape model.

If the timescape model is true, then the expansion of the universe is not accelerating (it’s just an illusion of our observations and assumptions), and therefore there is no need for dark energy. They further argue that their model is a better fit for the data than ΛCDM (but not by much). We need more and better data to definitively determine which model is correct. They are also not mutually exclusive – timescape may explain some but not all of the observed acceleration, still leaving room for some dark energy.

I find this all fascinating. I will admit I am rooting for timescape. I never liked the concept of dark energy. It was always a placeholder, and it has properties that are really counterintuitive. For example, dark energy does not dilute as spacetime expands. This does not mean it is false – the universe can be really counterintuitive to us apes with our very narrow perspectives. I will follow whatever the data says. But wouldn’t it be exciting if an underdog like timescape overturned a Nobel Prize-winning discovery, and for at least the second time in my lifetime radically changed how we think about cosmology? Timescape may also resolve the Hubble Tension to boot.

Whatever the answer turns out to be – clearly there is something wrong with our current cosmology. Resolving this “crisis” will expand our knowledge of the universe.
