We learned that the universe is expanding less than 100 years ago, and that the expansion is accelerating less than 20 years ago. It seemed we were really starting to get the hang of it, and then science threw a curveball.
We have two methods for estimating the current expansion rate of the universe, a value known as the Hubble Constant. Both methods have improved enormously over the last decade, and while their results were once in agreement, they now differ by about 8 percent.
One method uses stars and stellar explosions to measure the distances to galaxies, which are then used to estimate the Hubble Constant. The other uses the cosmic microwave background, the relic light released shortly after the Big Bang. The two methods are entirely independent of each other, and the fact that they don't agree raises a lot of questions.
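As a rough sanity check on that 8 percent figure, here is a minimal sketch using representative published values for the two estimates (roughly 67 km/s/Mpc from the cosmic microwave background and 73 km/s/Mpc from the stellar distance ladder; these specific numbers are illustrative assumptions, not taken from the article):

```python
# Representative (assumed) Hubble Constant estimates, in km/s/Mpc
h0_cmb = 67.4     # early-universe estimate, from the cosmic microwave background
h0_ladder = 73.0  # late-universe estimate, from stars and stellar explosions

# Percent difference, measured relative to the lower (CMB) value
discrepancy_percent = (h0_ladder - h0_cmb) / h0_cmb * 100
print(f"Discrepancy: {discrepancy_percent:.1f}%")  # close to the quoted ~8 percent
```

With these assumed inputs the difference comes out near 8 percent, consistent with the figure quoted above.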
In a commentary for Nature Astronomy, Professor Wendy Freedman discussed the current state of the field and what we might learn in the future that could resolve the issue. Both measurements carry some uncertainty, but even taking that into account, the two estimates still don't overlap.
“Is the discrepancy real or is this a ‘tension in a teapot?’” Professor Freedman wrote in the paper. “The obvious possibility is that one or both of the methods may suffer from unknown systematic errors.”
As she discussed, there are several possibilities. If only one of the methods is right, then we are missing something about either the nature of stars or the nature of the early universe. If both are right, then we might be seeing the first effects of a new physical phenomenon. Alternatively, to stay on the skeptical side, the discrepancy might be due to “as-yet unrecognized uncertainties.”
Both methods appear to be well thought out and tested, and they are supported by abundant observational evidence, but clearly something is going on; otherwise, we wouldn't have a debate over the value of the Hubble Constant.