
New JWST Data Explores 'Hubble Constant' Tension for Universe's Expansion Rate (space.com)

"Scientists can't agree on the exact rate of expansion of the universe, dictated by the Hubble constant," a new article at Space.com reminds us: The rate can be measured starting from the local (and therefore recent) universe, then going farther back in time — or, it can be calculated starting from the distant (and therefore early) universe, then working your way up. The issue is both methods deliver values that don't agree with each other. This is where the James Web Space Telescope (JWST) comes in. Gravitationally lensed supernovas in the early cosmos the JWST is observing could provide a third way of measuring the rate, potentially helping resolve this "Hubble trouble." "The supernova was named 'supernova Hope' since it gives astronomers hope to better understand the universe's changing expansion rate," Brenda Frye, study team leader and a University of Arizona researcher, said in a NASA statement.

This investigation of supernova Hope began when Frye and her global team of scientists found three curious points of light in a JWST image of a distant, densely packed cluster of galaxies. Those points of light in the image were not visible when the Hubble Space Telescope imaged the same cluster, known as PLCK G165.7+67.0 or, more simply, G165, back in 2015. "It all started with one question by the team: 'What are those three dots that weren't there before? Could that be a supernova?'" Frye said.

The team noted a "high rate of star formation... more than 300 solar masses per year," according to NASA's statement: Dr. Frye: "Initial analyses confirmed that these dots corresponded to an exploding star, one with rare qualities. First, it's a Type Ia supernova, an explosion of a white dwarf star. This type of supernova is generally called a 'standard candle,' meaning that the supernova had a known intrinsic brightness. Second, it is gravitationally lensed. Gravitational lensing is important to this experiment. The lens, consisting of a cluster of galaxies that is situated between the supernova and us, bends the supernova's light into multiple images...

To achieve three images, the light traveled along three different paths. Since each path had a different length, and light travels at the same speed, the supernova was imaged in this Webb observation at three different times during its explosion... Trifold supernova images are special: The time delays, supernova distance, and gravitational lensing properties yield a value for the Hubble constant... The team reports the value for the Hubble constant as 75.4 kilometers per second per megaparsec, plus 8.1 or minus 5.5... This is only the second measurement of the Hubble constant by this method, and the first time using a standard candle.

Their result? "The Hubble constant value matches other measurements in the local universe, and is somewhat in tension with values obtained when the universe was young."
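
The quoted explanation compresses the method quite a bit. As a rough sketch of the underlying idea (not the team's actual pipeline; the redshifts and lens geometry below are made-up placeholders): the time delays between the lensed images scale inversely with the Hubble constant, so comparing a model-predicted delay against the measured one pins down H0.

```python
# Rough sketch of time-delay cosmography (NOT the study's actual analysis).
# For a fixed lens model, the predicted delay between images scales as 1/H0,
# so comparing a predicted delay to a measured one constrains H0.
# Redshifts and density parameters below are made-up placeholders.
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458          # speed of light [km/s]
OMEGA_M, OMEGA_L = 0.3, 0.7  # assumed flat-universe density parameters

def comoving_distance(z, h0):
    """Comoving distance [Mpc] in flat LCDM (radiation ignored)."""
    integrand = lambda zp: 1.0 / np.sqrt(OMEGA_M * (1 + zp) ** 3 + OMEGA_L)
    integral, _ = quad(integrand, 0.0, z)
    return C_KM_S / h0 * integral

def time_delay_distance(z_lens, z_src, h0):
    """D_dt = (1 + z_l) * D_l * D_s / D_ls, with angular diameter distances."""
    d_l = comoving_distance(z_lens, h0) / (1 + z_lens)
    d_s = comoving_distance(z_src, h0) / (1 + z_src)
    # In a flat universe the lens-to-source comoving distance is the difference.
    d_ls = (comoving_distance(z_src, h0) - comoving_distance(z_lens, h0)) / (1 + z_src)
    return (1 + z_lens) * d_l * d_s / d_ls

z_lens, z_src = 0.4, 1.8      # placeholder geometry, not taken from the paper
h0_fiducial = 70.0            # km/s/Mpc
d_dt_fid = time_delay_distance(z_lens, z_src, h0_fiducial)

# Key point: D_dt (and hence any predicted delay) scales as 1/H0.
for h0 in (65.0, 70.0, 75.4):
    d_dt = time_delay_distance(z_lens, z_src, h0)
    print(f"H0 = {h0:5.1f} -> D_dt = {d_dt:8.1f} Mpc (ratio to fiducial: {d_dt / d_dt_fid:.3f})")

# So if the lens model predicts a delay dt_pred at the fiducial H0, and the
# supernova images give a measured delay dt_obs, then roughly:
#     H0_inferred ~ h0_fiducial * dt_pred / dt_obs
```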

Comments Filter:
  • The obvious conclusion is that the "Hubble constant" is not a constant. There is no particular reason, other than a preference for the simplest model, to expect the rate of expansion of the universe to be constant.
    • Re:Constant (Score:5, Informative)

      by Anonymous Coward on Sunday October 06, 2024 @12:37PM (#64844065)

      Constant in space, not time. Even the simplest cosmological models currently considered don't think we live in a de Sitter universe.

      The Friedmann equation is H^2 = (constants) * (energy density) + (spatial curvature term)/a^2,

      where H = (1/a) da/dt and a is the scale factor or "size" of the universe (or of a representative cell therein).

      And if it's not constant in space (or has different rates of expansion in different directions) then we have to rethink the cosmological principles (homogeneity and isotropy on large scales).

      As the universe expands, the energy density reduces, so the Hubble parameter goes down. Eventually, as the universe becomes void of matter, it will tend to a constant set by the cosmological constant (we think). But it has changed very much over the history of the universe.

      The tension here is that if we take the currently observed value and track it back to early times, it doesn't quite match up with the value we'd get if we just took the earliest observations. There are a lot of possible reasons for this, but don't think that we haven't considered "it's not a constant" - we have. Extensively.
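
      A minimal numerical sketch of the history just described, assuming a flat universe and using illustrative round numbers for the density parameters (not fitted values): the Friedmann equation above gives an H(a) that falls steeply as the universe expands and then levels off at a floor set by the cosmological constant.

```python
# Minimal sketch of how H changes over cosmic history: the Hubble parameter
# falls as the universe expands and the energy density dilutes, approaching a
# floor set by the cosmological constant. Density parameters are illustrative
# round numbers, not fitted values.
import numpy as np

H0 = 70.0                                  # today's value [km/s/Mpc]
OMEGA_M, OMEGA_R, OMEGA_L = 0.3, 9e-5, 0.7

def hubble(a):
    """H(a) from the Friedmann equation for a (nearly) flat universe."""
    return H0 * np.sqrt(OMEGA_M / a**3 + OMEGA_R / a**4 + OMEGA_L)

for a in (0.001, 0.01, 0.1, 0.5, 1.0, 10.0, 100.0):
    print(f"a = {a:7.3f}   H = {hubble(a):14.1f} km/s/Mpc")

# As a -> infinity, H -> H0 * sqrt(OMEGA_L) ~ 58.6 km/s/Mpc: a de Sitter-like
# constant, even though H has dropped by orders of magnitude since early times.
print("late-time floor:", H0 * np.sqrt(OMEGA_L))
```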

      • Constant in space, not time. Even the simplest cosmological models currently considered don't think we live in a de Sitter universe.

        The Friedmann equation is H^2 = (constants) * (energy density) + (spatial curvature term)/a^2,

        where H = (1/a) da/dt and a is the scale factor or "size" of the universe (or of a representative cell therein).

        And if it's not constant in space (or has different rates of expansion in different directions) then we have to rethink the cosmological principles (homogeneity and isotropy on large scales).

        As the universe expands, the energy density reduces, so the Hubble parameter goes down. Eventually, as the universe becomes void of matter, it will tend to a constant set by the cosmological constant (we think). But it has changed very much over the history of the universe.

        The tension here is that if we take the currently observed value and track it back to early times, it doesn't quite match up with the value we'd get if we just took the earliest observations. There are a lot of possible reasons for this, but don't think that we haven't considered "it's not a constant" - we have. Extensively.

        Another way to view it is to think in terms of construction.

        Assume the universe is computable, that means it can be simulated using a computer program.

        The program has to represent space in some way, and regardless of the implementation (or compression algorithm) space can be considered a large 3D array of points, possible positions in the universe.

        If expansion is homogeneous, then all of space is expanding all the time. Everywhere you look you see everything expanding away from you, like the surface of an expanding balloon.
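
        A toy numerical sketch of that homogeneous-expansion picture, with made-up numbers throughout: scale every coordinate of a grid of points by the same factor and any observer sees every other point receding at a speed proportional to its distance, i.e. a Hubble-like law.

```python
# Toy sketch of homogeneous expansion, with made-up numbers: scale every
# coordinate of a point cloud by the same factor, and a Hubble-like law
# (recession speed proportional to distance) falls out for any observer.
import numpy as np

rng = np.random.default_rng(0)
points = rng.uniform(-50.0, 50.0, size=(1000, 3))   # comoving positions, arbitrary units

H = 0.07     # made-up expansion rate [1 / time unit]
dt = 1.0     # one small time step

# Homogeneous expansion: every coordinate grows by the same factor (1 + H*dt).
expanded = points * (1.0 + H * dt)

# Pick any point as "the observer"; the expansion looks the same from each one.
obs_now, obs_later = points[0], expanded[0]

distances = np.linalg.norm(points - obs_now, axis=1)
velocities = np.linalg.norm((expanded - obs_later) - (points - obs_now), axis=1) / dt

# Recession speed divided by distance is the same for every point: v = H * d.
ratio = velocities[1:] / distances[1:]   # skip index 0 (the observer itself)
print("v/d min, max:", ratio.min(), ratio.max())   # both ~0.07, i.e. ~H
```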

        • by Bumbul ( 7920730 )

          That's also a pretty simple, perhaps the simplest explanation. The rate of expansion is a secondary effect of the rate of generating new locations in space: the generation rate is fixed and unchanging, but as a result the visible expansion seems to be slowing down.

          You put much effort into this long post. Unfortunately your thinking process goes wrong from this very first assumption of yours. The current evidence from standard candles points to the fact that the rate of expansion is speeding up, not slowing down.

          • And your model? (Score:4, Interesting)

            by Okian Warrior ( 537106 ) on Sunday October 06, 2024 @05:09PM (#64844483) Homepage Journal

            That's also a pretty simple, perhaps the simplest explanation. The rate of expansion is a secondary effect of the rate of generating new locations in space: the generation rate is fixed and unchanging, but as a result the visible expansion seems to be slowing down.

            You put much effort into this long post. Unfortunately your thinking process goes wrong from this very first assumption of yours. The current evidence from standard candles points to the fact that the rate of expansion is speeding up, not slowing down.

            My very first assumption was indeed wrong (the universe is not computable), but I made it on purpose to keep the post simpler, because that point is irrelevant. Not being computable can arise from several factors, and these factors can be dealt with in the overall model to make testable predictions.

            That the Hubble constant is increasing is not in any way a fact pointed to by the evidence; the evidence [cloudfront.net] is, right now, uncertain, or perhaps conditioned on the year of publication: the Hubble constant takes various values depending on which year the paper was published.

            Here's [skyandtelescope.org] a good overview from 2019, in case you're interested.

            I'd be interested to hear your constructive description of the universe. Since you knew my thinking went wrong from the first, I assume you know which parts of the universe are not computable.

            My model deals with the non-computable bits handily. How does your model deal with this?

            Or are you just interested in the global measurement and not, for example, how that value comes about in our universe?

            • by Bumbul ( 7920730 )
              Since you asked, I'm into the VSL hypothesis - i.e. the speed of light varies over time, hence the apparent result that expansion is speeding up. The REASON for the change is then another matter. One candidate could be e.g. Dynamic Universe: https://www.physicsfoundations... [physicsfoundations.org]
        • Maybe the universe is a hyper-balloon-animal. The twists are dark matter.
      • “don't think that we haven't considered "it's not a constant" - we have. Extensively.”

        How much hubris is involved in keeping cherished assumptions such as homogeneity and isotropy on large scales?

      • by HiThere ( 15173 )

        We know it's not constant in space because gravity prevents that. The local group is gravitationally bound, and is not expanding. Get beyond that and in at least some directions you see expansion.

        OTOH, this probably isn't a large enough effect to explain the tension. (I'm no expert in the field, and not even extremely interested, so correct me if I'm wrong.)

        IIUC, the Hubble "constant" is intended to be an average rate of expansion. It's always seemed to me that for it to be constant over time would requ

    • Re:Constant (Score:5, Informative)

      by Baron_Yam ( 643147 ) on Sunday October 06, 2024 @01:23PM (#64844163)

      1) We already know it isn't necessary under the current model for it to be constant. The Inflationary Epoch is worth reading up on.

      2) The issue isn't that it might be variable, the issue is we have two presumed reliable methods for measuring it, and we're getting two different answers whose margins of error do not overlap.
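
      To put rough numbers on point 2: a back-of-the-envelope check, using approximate published central values and uncertainties (rounded here; exact figures vary by analysis), plus the lensed-supernova value quoted in the summary above.

```python
# Back-of-the-envelope check of the "non-overlapping error bars" point, using
# approximate published values (rounded; the exact figures vary by analysis):
# a local distance-ladder style measurement, an early-universe (CMB) inference,
# and the lensed-supernova value quoted in the summary above.
import math

def tension_sigma(v1, e1, v2, e2):
    """Difference between two measurements in units of their combined error."""
    return abs(v1 - v2) / math.sqrt(e1 ** 2 + e2 ** 2)

local_h0, local_err = 73.0, 1.0     # roughly the Cepheid/supernova ladder value
early_h0, early_err = 67.4, 0.5     # roughly the CMB-inferred value
lensed_h0, lensed_err = 75.4, 5.5   # this study, using its smaller error bar

print("local vs early: ", round(tension_sigma(local_h0, local_err, early_h0, early_err), 1), "sigma")
print("lensed vs early:", round(tension_sigma(lensed_h0, lensed_err, early_h0, early_err), 1), "sigma")
# ~5 sigma for the first comparison (the "Hubble tension"); the new lensed-SN
# value leans the same way as the local one but, with its wide error bar, is
# only mildly (~1.4 sigma) in tension with the early-universe value.
```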

      • The two measurements are at different distance ranges and consequently at different times in the past. The farther away we look in the universe, the further back in time we are looking.
  • Halton Arp demonstrated over and over that young galaxies have different wavelengths than the older parent galaxies they are physically connected to. As if one is receding and the other is approaching. His X-ray photographs show these galaxies to be physically connected, yet one has a red-shift and the other a blue-shift, indicating that these frequencies are intrinsic and not a valid indicator of relative movement (or distance).

    Arp's work clearly falsifies the notion that redshift is a direct indicator of univ

  • Do some parts of the universe expand faster than other parts?
    How do we know that the expansion is constant?
