Space Science

Is Space-Time Quantized Or Analog? (space.com) 148

"What are the implications if 'space-time' (as conceived of in the Einstein Theory of General Relativity) is quantized like all other aspects of matter and energy?" asks Slashdot reader sixoh1. Space.com reports of a new study that tried to find out: In order for the math of general relativity to work, this fabric of space-time has to be absolutely smooth at the tiniest of scales. No matter how far you zoom in, space-time will always be as wrinkle-free as a recently ironed shirt. No holes, no tears, no tangles. Just pure, clean smoothness. Without this smoothness, the mathematics of gravity simply break down. But general relativity isn't the only thing telling us about space-time. We also have quantum mechanics (and its successor, quantum field theory). In the quantum world, everything microscopic is ruled by random chance and probabilities. Particles can appear and disappear at a moment's notice (and usually even less time than that). Fields can wiggle and vibrate with a will all their own. And nothing can ever be known for certain. [...]

That's exactly what a team of astronomers did, submitting their results for publication in the Monthly Notices of the Royal Astronomical Society, and also posting their work to the online preprint site arXiv. And in a perfect coincidence, they searched for the frothiness of space-time using ... espresso. No, not the drink. ESPRESSO, the Echelle Spectrograph for Rocky Exoplanet and Stable Spectroscopic Observations, an instrument based at the European Southern Observatory's Very Large Telescope. As its name suggests, ESPRESSO was not designed to search for space-time frothiness, but it turned out to be the best tool for the job. And the astronomers pointed it at a perfect source: a run-of-the-mill gas cloud sitting over 18 billion light-years away. What makes this particular gas cloud especially useful are two facts. One, there is a bright source sitting just behind it, illuminating it. And two, there's iron in the cloud, which absorbs the background light at a very specific wavelength.

So from our vantage point on Earth, if space-time is perfectly smooth, that gap in the background light caused by the gas cloud should be just as narrow as if the cloud was sitting right next to us. But if space-time is frothy, then the light traveling over the billions of light-years will spread out, changing the width of the gap. The astronomers didn't find any hint of frothiness, which doesn't mean that it doesn't exist -- it just means that if space-time is frothy, we need more than 18 billion light-years to see it with our current technology. But the results were able to rule out some models of quantum gravity, sending them into the proverbial dustbin of physics history.


Is Space-Time Quantized Or Analog?

Comments Filter:
  • The entire cosmos is running in God's big toenail.
    • Space-time is an nth-dimensional representation of an (n+1)th-dimensional being. In other words, God's nude selfie.
    • by Joce640k ( 829181 ) on Friday May 01, 2020 @05:04AM (#60010120) Homepage

      "as wrinkle-free as a recently ironed shirt"

      That has to be one of the worst analogies ever.

    • by jellomizer ( 103300 ) on Friday May 01, 2020 @08:47AM (#60010432)

      I find it amazing and troublesome that science in some fields has grown beyond direct observation. We are at a point now where we have a mathematical formula. Using such a formula, we should expect to see certain artifacts. So we go looking for the artifacts.

      It is amazing that with this method we can build much better knowledge of the universe, even of parts we will never experience directly.
      It is also troublesome because it is getting more and more complex to explain to the non-expert in the field. With vague terms like "frothiness", non-experts in that field either need to trust the statements or just not believe them.

      I feel this is part of the problem with science deniers today. Much of the science is so abstract and unrelatable that people just don't believe it. It is like global warming: I am sitting here under a blanket because this month is unseasonably cold, yet the math shows global temperatures are rising. The Earth is so big compared to me that the trend is difficult to see. It is easier to worry about global warming when we have a temperate winter or a hot summer.
      However, this mistrust is causing people to double down further and further. So we have nuts like the flat-Earthers and anti-vaxxers, who avoid science and outwardly reject it, because modern science is difficult to observe now.

      • by thereddaikon ( 5795246 ) on Friday May 01, 2020 @10:39AM (#60010836)

        I wrote a rant, then deleted it. I decided it was too wordy, so I'll try to be more concise this time. Yes, anti-science is a problem today, but academia is not doing itself any favors in the PR department. And I would go so far as to say that it deserves a good portion of the blame for why people have turned against it.

        Research needs to be published more openly. Anything funded with government dollars should have the paper in the public domain. Nobody should have to pay a subscription to see what their tax dollars are getting them.

        Academia needs to stop talking down to people from its ivory towers and then getting offended when it is misunderstood. "Theory" has a very precise meaning in science, but in general English it means little more than a decent guess. We have a term for that: it's called jargon. And rule #1 of jargon is you don't use it when explaining something to users. Only assholes do that to sound smart. So when most people, most of the time, use one definition of "theory", don't go on CNN and use your industry-specific definition. Speak like a normal person. You'll get farther.

        The current business model of research is also broken. The system in place incentivises positive results and disincentivises null and negative results. For reasons that should be obvious, that's bad. Not only does it affect what research teams will choose to pursue, but the publish-or-die mentality has turned academia into a toxic and cutthroat place. It's also caused a minor crisis of junk research. Not only do you have garbage-tier journals which will publish the academic equivalent of excrement, but a shocking number of heavily cited papers are not reproducible, and we are seeing a rise in the number of outright faked and fraudulent test results. The peer review system, which was put in place to prevent these issues, is seemingly ineffective. Again, because the model is broken. The reviewers don't have it as a full-time job. Often it's not even paid. They have their own research to attend to and lack the time and resources to verify and replicate the complex experiments. So their stamp amounts to little more than a proofread for major logical or writing flaws. Trust that the researcher submitting the work is on the up and up is implicit.

        I know this is a problem because I had the displeasure of seizing the equipment of an academic and their assistants who had been found to publish intentionally falsified experimental data. It wasn't found until long after the fact. If the peer review process were functional instead of dysfunctional it would have caught it in the submission. If the publish or die mentality didn't exist then they likely wouldn't have attempted it in the first place.

        While outright fraud in research is rare, the examples that do happen tend to be well reported, and it gives science deniers ammunition. Combine that with the above issues of opacity and it's no wonder we went from championing the successes of science in one generation to calling it fake news.

        • by lgw ( 121541 ) on Friday May 01, 2020 @02:25PM (#60011810) Journal

          It wasn't found until long after the fact. If the peer review process were functional instead of dysfunctional it would have caught it in the submission.

          Peer review isn't there to catch fraud, it's to catch honest mistakes in process. Assuming they described good methodology in their papers, but were lying, peer review isn't supposed to catch that. Peer review is just the filter that says "it's worth a wider audience looking at this". The part where someone in the wider audience might try to reproduce the results is after that.

          The problem is, as you said earlier, that there's no incentive to reproduce results. If a paper is controversial, then sure, basic human psychology takes care of it, but that's rare. It seems the majority of published biochem synthesis results are fake. That went from a few people faking it to make their quotas, to most people faking results to keep up, to no one daring to question the other guy's work for fear of their own being questioned. Same thing in psychology and sociology, where blatantly bad statistical methodologies are used to make it appear there's something worth publishing, but no one gets called out because almost everyone is doing the same thing.

          I guess some fields have gotten so bad it is a peer review failure, as they're not even lying convincingly in the papers any more, in an "emperor's new clothes" sort of way.

          While outright fraud in research is rare,

          That's sadly not true in some fields, as I mentioned above. Just not ones that are currently in the anti-science debate. We'll certainly be a cautionary tale told by science historians centuries from now: "here's how an entire field can become a sham, which is why we do X, Y, and Z now".

      • Re: (Score:3, Interesting)

        by ceoyoyo ( 59147 )

        That happened at least as long ago as Eratosthenes sticking some posts in the ground and measuring the circumference of the planet.

        One of the world's favourite hobbies is to sit in a chair and yell advice to athletes on TV. The athlete is much superior to the couch potato, both physically and in ability to play the game. Yet the couch potatoes yell advice anyway. And no matter how well the athlete performs, you will always find some couch potatoes who think they're shite.

        Science deniers, pseudoscience enthusi

      • I agree. The problem is that a great deal of science now requires a great deal of trust in the 'experts', who as often as not have their own biases and argue with one another. Sometimes legitimately and sometimes not. The secondary problem is that there is a portion of the scientific population who seem to insist that the only legitimate means of knowledge is the one they control, but that is neither how human beings experience the world, nor is it really even true. As a somewhat classic example, suppose I w

      • What makes you think they deny it because they don't believe or insufficiently understand the science? I think they have other motivations in many cases.
    • What tests can we make to prove your hypothesis?

    • I believe you meant, right pinkie fingernail. The big toenail hypothesis was disproven by Schwartz and Heistein in 1898.
  • by hcs_$reboot ( 1536101 ) on Friday May 01, 2020 @04:12AM (#60010042)
    both
  • by pieisgood ( 841871 ) on Friday May 01, 2020 @04:27AM (#60010062) Journal

    Continuous. Analog? What the fuck is this? A vinyl record?

    • by msauve ( 701917 ) on Friday May 01, 2020 @06:11AM (#60010222)
      An LP isn't analog, it's quantum to molecular scale. I thought Space-Time was quantum to the Planck scale.
      • I thought Space-Time was quantum to the Planck scale.

        Nobody really knows. Many argue that space-time is not quantized. Neither side can prove it.

      • The Planck scale is the smallest you can measure, not the smallest you can be.

        • by Wargames ( 91725 )

          I suggest that if you are smaller than Plank scale you are not.

          • by Wargames ( 91725 )

            Too bad I can't fix the spelling in my own posts.

            I suggest that if you are smaller than Planck scale you are not.
            Also.
            I suggest that if you are smaller than Planck scale you are naught.
            Also.
            Some, but not I, suggest that if you are smaller than Planck scale you are knot.

        • When it is said "the smallest YOU can measure", that means the smallest that ANYTHING can measure anything else, or, I think equivalently, the smallest at which any information or non-random influence can be transferred from something at one location to something at another.
          So Planck-scale is a constraint on particle/wave/field interactions I suspect, in that they cannot exist in any inter-constraining fashion below that scale.
          In other words, whatever is, and happens, below that scale must be open to being
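For reference, the Planck length discussed in this sub-thread has a definite value computable from the fundamental constants. A minimal sketch (CODATA values hard-coded here for illustration, so treat the exact digits as approximate):

```python
import math

# CODATA values (hard-coded; assumed, for illustration only)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 299792458.0         # speed of light, m/s

# Planck length: l_P = sqrt(hbar * G / c^3)
l_planck = math.sqrt(hbar * G / c**3)
print(f"Planck length ~ {l_planck:.3e} m")  # on the order of 1.6e-35 m
```

Anything below roughly this scale is where the "smallest you can measure" arguments above apply.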
    • Apparently the terms are "smooth" or "chunky".

    • Magnetic Tape.

  • ...that when you get down to this sort of depth, what reality actually is, is probably beyond the understanding of evolved ape brains. After almost 100 years no one has yet explained the results of the quantum double-slit experiment - lots of guesswork, but that's all it is - never mind even more complex questions about the universe.

    Much as it would be nice to believe we'll eventually have a Theory Of Everything, IMO that's just wishful thinking. Perhaps in the future AI may push us forward further but I suspe

    • Re:I suspect... (Score:5, Informative)

      by Sique ( 173459 ) on Friday May 01, 2020 @04:48AM (#60010096) Homepage
      In this case, it was a quite simple idea. If I look at something through a frothy space-time (like through a Fresnel lens), I will probably see some quantization artefacts if I magnify it enough. But apparently, light shining through a gas cloud 18 billion light-years away doesn't show any quantization artefacts. So we have an upper limit for the size of a single quantum of spacetime, ruling out any theory that uses a larger quantization raster.
      • by Calydor ( 739835 )

        The part I as a layman don't get is that last I heard, the universe is approximately 13-14 billion years old; how are we on Earth seeing light that comes from a distance of 18 billion light years?

        • Space expands, so objects that are currently 18 billion light years apart would not have been when the photon left.
        • Re:I suspect... (Score:5, Informative)

          by Sique ( 173459 ) on Friday May 01, 2020 @05:48AM (#60010186) Homepage
          The distance of 18 billion light-years (in the paper, it's 5.8 gigaparsecs) is the so-called comoving distance [wikipedia.org], basically the distance renormalized via the Hubble constant. The gas cloud is at redshift z=2.34, and 18 billion light-years is the distance right now, if we freeze time and then measure the distance.
        • by alexo ( 9335 ) on Friday May 01, 2020 @09:09AM (#60010506) Journal

          While objects cannot move faster than the speed of light with respect to each other (special relativity), space itself can expand faster than that (general relativity).

          To make a very simplified analogy, imagine ants crawling on the surface of a balloon. While the ants cannot move faster than some maximum "speed of ant", the balloon itself may be inflated so rapidly that the distance between two ants will increase faster than they are able to crawl away from each other on a static surface, even if the ants stay put.
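The figures quoted above can be sanity-checked numerically. For a flat ΛCDM universe with assumed round parameters (H0 ≈ 70 km/s/Mpc, Ωm ≈ 0.3 -- illustrative values, not necessarily the paper's), the comoving distance to redshift z = 2.34 comes out near 5.7 Gpc, i.e. roughly 18 billion light-years:

```python
# Rough comoving-distance estimate for z = 2.34 in flat LambdaCDM.
# H0 and Omega_m are assumed round numbers, not the values used in the paper.
C_KM_S = 299792.458   # speed of light, km/s
H0 = 70.0             # Hubble constant, km/s/Mpc (assumed)
OMEGA_M = 0.3         # matter density parameter (assumed)
OMEGA_L = 1.0 - OMEGA_M

def E(z):
    """Dimensionless Hubble parameter H(z)/H0 for a flat universe."""
    return (OMEGA_M * (1 + z) ** 3 + OMEGA_L) ** 0.5

def comoving_distance_mpc(z, steps=10000):
    """Midpoint-rule integral of (c/H0) * dz' / E(z') from 0 to z."""
    dz = z / steps
    total = sum(dz / E((i + 0.5) * dz) for i in range(steps))
    return (C_KM_S / H0) * total

d_mpc = comoving_distance_mpc(2.34)
d_gly = d_mpc * 3.2616e6 / 1e9  # Mpc -> billions of light-years
print(f"~{d_mpc/1000:.1f} Gpc, ~{d_gly:.1f} billion light-years")
```

So light emitted less than 13 billion years ago can indeed originate from a point that is now ~18 billion light-years away.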

      • But I don't think you can magnify something 18 billion LY away enough to see the quantization without noise swamping your data.

        • by Sique ( 173459 )
          That's why the team has chosen a signal which can be easily distinguished from the noise: A special line in the spectrum of iron.
        • Re:I suspect... (Score:5, Informative)

          by Sique ( 173459 ) on Friday May 01, 2020 @01:01PM (#60011538) Homepage
          Or to be more exact: The team did what a high-fidelity tester would do to test equipment: send a very clearly defined signal through the equipment and then measure the difference between input and output. The difference is the noise. In this case, the equipment was the empty space between the interstellar cloud and our measuring equipment, and the signal was a specific absorption line of iron. Iron has the advantage that it is not hydrogen or helium, which are quite abundant even in mostly empty space. What the team measured was how much the absorption-line signal had leaked into neighboring wavelengths, to determine the smoothness of the interstellar space between said cloud and Earth. And the result was that the line didn't leak into neighboring wavelengths at all, so the space-time at least from there to here is so smooth that our measurements didn't find any noise at all.
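The "send a known signal, measure the spread" idea can be illustrated with a toy model: a Gaussian absorption line whose measured width is compared against a hypothetically broadened version. All numbers below are illustrative placeholders, not values from the actual ESPRESSO analysis:

```python
# Toy model of the line-width test: compare the measured FWHM of an
# intrinsic absorption line against a (hypothetically) broadened one.
import math

def fwhm_of_samples(wavelengths, depths):
    """Full width at half the maximum absorption depth, from a sampled profile."""
    half = max(depths) / 2.0
    above = [w for w, d in zip(wavelengths, depths) if d >= half]
    return max(above) - min(above)

def gaussian_line(center, sigma, n=2001, span=5.0):
    """Sample a Gaussian absorption profile on a uniform wavelength grid."""
    ws = [center + (i / (n - 1) - 0.5) * 2 * span for i in range(n)]
    ds = [math.exp(-((w - center) ** 2) / (2 * sigma ** 2)) for w in ws]
    return ws, ds

ws, intrinsic = gaussian_line(center=2600.0, sigma=0.05)  # iron line, toy units
_, broadened = gaussian_line(center=2600.0, sigma=0.08)   # "frothy" propagation

w_in = fwhm_of_samples(ws, intrinsic)
w_out = fwhm_of_samples(ws, broadened)
print(f"intrinsic FWHM ~ {w_in:.3f}, broadened FWHM ~ {w_out:.3f}")
```

If space-time froth smeared the light, the observed line would come out wider than the intrinsic one; the null result means no such excess width was seen.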
    • ...that when you get down to this sort of depth what reality actually is, is probably beyond the understanding of evolved ape brains.

      That's not a limitation of our brains. We have plenty of theories.

      After almost 100 years no one has yet explained the results of the quantum double slit experiment

      We're lacking the funding and/or technology to prove/disprove the theories one way or another. You can't do it by biting the quantums or bashing them with rocks.

      • FWIW: Feynman gave a very credible explanation for the double-slit experiment back in the 1960s.

        • by Viol8 ( 599362 )

          Link?

          He didn't explain anything, he just described what happened with a guess as to why. Which sums up quantum mechanics in general - nobody to this day knows how it's wired under the board, so to speak.

          • He also explained why it's impossible to ever test it.

            My response was to the person who said our monkey brains were incapable of understanding it. That's not true at all, we're perfectly capable of understanding it.

            • Einstein's theory only makes sense in the context of movement, i.e. a curved trajectory. It never explained why a non-moving object will start to move and gain kinetic energy in a gravitational field, or where that energy comes from (the Big Bang, which wasn't known about then).

              • "Einsteins theory only makes sense in the context of movement , ie a curved trajectory. It never explained why a non moving object will start to move"

                Of course he did! So much so, that's the core of what he did.

                "There ain't such a thing as a non moving object". That was his answer.

            • LOL. I refer you to Feynman's quote about understanding quantum mechanics.

          • by Kjella ( 173770 )

            He didn't explain anything, he just described what happened with a guess as to why. Which sums up quantum mechanics in general - nobody to this day knows how it's wired under the board, so to speak.

            I think if you really dig into it, the world is unexplained. We have a formula for gravity, but if I jump, what's actually pulling me down? Even in a vacuum, where there's nothing, there's still something pulling mass together. And why does it want to lump together in the first place? At some point you just have to say that's simply what we've observed, and if there's an even more fundamental underlying model, we haven't found it yet.

            • Gravity was explained perfectly well by Einstein. Nobody's thought of it as a force for a very long time now.

      • by Viol8 ( 599362 )

        "That's not a limitation of our brains. We have plenty of theories."

        Anyone can come up with a theory - I could say it's due to dancing green ducks on the moon. Coming up with a provable explanation of what is really going on is something else entirely.

        • ...due to dancing green ducks on the moon.

          If you wrap that theory up in a conspiracy, perhaps linking it to 5G towers and the fake moon landings, you'd likely have more believers than you could shake sticks at.

          "If we really landed on the moon, why didn't we see the ducks!?!?"

        • by paiute ( 550198 )
          You mean we have plenty of hypotheses.
        • Anyone can come up with a theory - I could say its due to dancing green ducks on the moon.

          The phrase you want is "plausible theory".

    • The interesting thing with science is that it is not a game you can win, only one you can get better at.

      For every answer you find, it opens up many more questions.

  • by poptopdrop ( 6713596 ) on Friday May 01, 2020 @04:38AM (#60010074)
    You can guarantee that if it was shown to be digital, there would be some hairy-arsed dickhead moaning on about how it was much better when it was analog but if you use an iron cloud instead of gold you won't notice the difference.
  • Am I the only one who thought it was going to be about the software Quantum ESPRESSO [wikipedia.org]? Ah, nevermind.
  • by Harold Halloway ( 1047486 ) on Friday May 01, 2020 @06:06AM (#60010218)

    It's turtles all the way down.

  • i.e. Dividing the universe into voxels [wikipedia.org] of a tiny but fixed, uniform size.

    That's not the only way to quantize a simulation though. e.g. If you're simulating linear distances, you can use an int or a float. With an int, distance is quantized with discrete steps equal in size. With a float, the steps are logarithmically distributed about the origin - high granularity at the origin, low granularity far away. (If you used a new variable which is a combination of int + float, you can center the high granul
    • You're losing accuracy the further you go, as more and more bits of your float turn to zero. Ints suffer that too, you just immediately drop to zero once you get outside its rasterization area.
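The granularity argument above can be made concrete with `math.ulp` (Python 3.9+), which gives the spacing between adjacent representable doubles at a given magnitude: tiny steps near the origin, huge steps far away, versus an int's uniform step of 1. A quick sketch:

```python
import math

# The "quantization step" between adjacent doubles grows with distance
# from the origin -- logarithmically distributed granularity.
for x in (1.0, 1e6, 1e15):
    print(f"ulp({x:g}) = {math.ulp(x):.3e}")

# Near 1.0 the step is 2**-52 (~2.2e-16); near 1e15 it is already ~0.125.
assert math.ulp(1e15) > math.ulp(1e6) > math.ulp(1.0)
```

An int, by contrast, quantizes distance in equal unit steps across its whole range and then fails abruptly at its bounds.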

  • by Proudrooster ( 580120 ) on Friday May 01, 2020 @07:15AM (#60010278) Homepage

    The universe is a digital system simulating analog at a very high bit rate.

  • by DavenH ( 1065780 ) on Friday May 01, 2020 @08:00AM (#60010326)
    I think we got started off on a false premise. The quantum nature of small things does not mean [youtube.com] they're all quantized, but rather that they are in principle unmeasurable to arbitrary accuracy. That certain particles exist in discrete energy states in the context of atoms (electrons) is not fundamental to quantum mechanics.
    • I think you're forgetting that small things not only cannot be measured by "us", but by each other also. They can't transfer information/constrained influence to each other (and be interacting in patterned ways) below that scale.

      Everything is an observer (every particle/wave/field location, what have you) of potentially anything else within its light-cone, provided there is enough matter-energy over here at the observer to encode something of what happened over there at the observed space-time location. That
  • Um...no... (Score:5, Interesting)

    by bradley13 ( 1118935 ) on Friday May 01, 2020 @08:20AM (#60010362) Homepage

    "In order for the math of general relativity to work, this fabric of space-time has to be absolutely smooth at the tiniest of scales."

    It's been a while since I could do the math, but (IIRC) this is not true. It is true that relativity doesn't expect quantization, and that we normally think of its formulae as working on continuous variables. However, there's not really anything that prohibits quantization.

    As one counter-example, consider the recent brainstorm by Stephen Wolfram [stephenwolfram.com]. He proposes essentially a form of automata theory - very much quantized - from which he is able to derive the basic formulae of general relativity.

    Whether or not his ideas hold up, they certainly demonstrate that there is no fundamental incompatibility between relativity and quantum mechanics.

    • Sadly, Wolfram's theory (and most others I've seen) are all missing one important thing: What causes change?

      Wolfram's system assumes something can "process" transformations. But there is no cause for those transformations, other than just the axiom "changes occur."

      I'd be very interested to see a theory that can explain the "origin of change".

      • What about, and this is n-th level (as if psilocybin-induced) speculation, if we postulate, yeah like the axiom you mentioned, that
        - all logically-possible changes could be occurring, but also that
        - there is some-kind of dependency on energy (or the underlying "substrate" pre-cursor of energy) which means that
        - there is a probability/"measure" function/principle operating that means that
        - simple, local-ish (in the graphs and in steps/# of participants) changes are much much more probable (or that, only the
        • By the way this "change needs to be there to make it go" problem also affects the Tononi's IIT "theory" of consciousness.
          Clearly, a whole-bunch of densely, complexly internetworked nodes, with both inter-representation and representation of the external, is not going to be conscious unless something is (probably massively parallel-ly/quantum-computationally) "information-processing", sieving through the information-network, expending energy over time to do so.
    • Wolfram's "work" is equivalent to the fact that an infinite number of functions can make a graph through a finite number of points. He has nothing testable, useful nor actionable.

      Meanwhile, physicists know GR is in fact totally incompatible with quantum field theory, the equations are irreconcilable and there are long ongoing attempts to formulate something that is as useful in both their realms as either one is in certain realms.

  • 18 billion light-years away? If the universe is only around 13.8 billion years old (depending on who you ask), how is it possible to see that?

    • by flink ( 18449 )

      18 billion light-years away? If the universe is only around 13.8 billion years old (depending on who you ask), how is it possible to see that?

      The photons were emitted less than 13 billion years ago, but the universe expanded in the meantime while they were on their journey, so their point of origin is now 18 billion ly away. This is observable as redshift in the photons.

    • Space is expanding; the visible universe has a radius of about 46.5 billion light-years. In the 1990s it was discovered that the rate of expansion is increasing too, which was attributed to "dark energy".

  • Not even the most well-funded scientific organization in the world knows the answer to this question. Does the guy who asked actually expect to find a reasonable answer from /.?
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Friday May 01, 2020 @09:31AM (#60010574)
    Comment removed based on user account deletion
  • When you do the derivations to arrive at what the speed of light must be, you use other numbers which represent quantized states.... so one would think that therefore Space-Time would also be quantized.
    • False - the other constants that derive it are not. The fine-structure constant is not quantized, nor are the permittivity and permeability of free space.
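As a side note, the fine-structure constant mentioned above is computed from other constants and comes out dimensionless, ~1/137. A quick sketch with hard-coded CODATA values (treat the digits as approximate):

```python
import math

# CODATA values (hard-coded for illustration)
e = 1.602176634e-19       # elementary charge, C
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
hbar = 1.054571817e-34    # reduced Planck constant, J*s
c = 299792458.0           # speed of light, m/s

# alpha = e^2 / (4 * pi * eps0 * hbar * c) -- a dimensionless ratio
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha ~ {alpha:.6e}, 1/alpha ~ {1/alpha:.3f}")  # ~1/137.036
```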

  • No.

  • If this is a computer simulation, then with bits we can only store finite-precision numbers.

  • I'm surprised that quantized spacetime doesn't come up more as an explanation for the Star Trek transporters. It implies that everything must teleport by the minimum unit constantly.

    Jumping around the universe is literally a fundamental component of all motion. You no longer have to wonder if you are dead and cloned. (Well... as an effect of the transporter. I guess if you think about it a lot, you have to wonder if you are still you if you've moved 2 meters through the universe.) The transporter is simply
