Space

Theory of Information Could Resolve One of the Great Paradoxes of Cosmology 183

KentuckyFC writes: When physicists attempt to calculate the energy density of the universe from first principles, the number they come up with using quantum mechanics is 10^94 g/cm^3. And yet the observed energy density is about 10^-27 g/cm^3. In other words, our best theory of reality misses the mark by 120 orders of magnitude. Now one researcher says the paradox can be resolved by considering the information content of the universe. Specifying the location of the 10^25 stars in the visible universe to an accuracy of 10 cubic kilometers requires some 10^93 bits. And using Landauer's principle to calculate the energy associated with all these bits gives an energy density of about 10^-30 g/cm^3. That's not a bad first-principles result. But if the location has to be specified to the Planck length, then the energy density is about 117 orders of magnitude larger. In other words, the nature of information should lie at the heart of our best theory of reality, not quantum mechanics.
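
The summary's arithmetic can be sanity-checked in a few lines. A minimal sketch in Python, assuming Landauer erasure at the CMB temperature (~2.7 K) and the volume of the observable universe -- neither assumption is stated in the summary itself:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
c = 2.998e8          # speed of light, m/s
T = 2.725            # CMB temperature, K (assumed; not stated in the summary)

n_bits = 1e93        # bits to locate 10^25 stars to ~10 km^3 (from the summary)
E = n_bits * k_B * T * math.log(2)   # Landauer bound: k_B*T*ln2 joules per bit

R = 4.4e26                           # observable-universe radius, m (assumed)
V = (4 / 3) * math.pi * R**3         # ~3.6e80 m^3

rho = E / c**2 / V                   # mass-equivalent density, kg/m^3
print(f"{rho * 1e-3:.0e} g/cm^3")    # ~1e-30, consistent with the summary
```

Under those assumptions the numbers do land on the quoted ~10^-30 g/cm^3.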
  • Numerology (Score:5, Insightful)

    by PvtVoid ( 1252388 ) on Wednesday February 18, 2015 @10:52AM (#49079533)

    Why, for instance, 10 cubic-kilometer voxels? Why not 100, or 1, or 0.1? How about 10^18 cubic kilometers, which is about the volume of the sun? Adjust this number correctly, and you can match any energy density you want.

    This is the problem with the science blogosphere: they'll take any press release whatsoever and echo it around regardless of whether or not it makes any fucking sense at all.
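    To put numbers on that objection: if the bit count scales inversely with voxel volume (an assumption -- TFS doesn't say how the encoding works), calibrating against TFS's pair (10 km^3 voxels, ~10^-30 g/cm^3) lets you hit nearly any density by choosing the voxel:

    ```python
    import math

    rho_ref = 1e-30     # g/cm^3 for 10 km^3 voxels, from TFS
    v_ref = 10 * 1e9    # 10 km^3 in m^3

    for label, v in [("Planck volume", (1.616e-35) ** 3),
                     ("1 m^3", 1.0),
                     ("10 km^3 (the paper's choice)", v_ref),
                     ("solar volume, ~1.4e27 m^3", 1.4e27)]:
        rho = rho_ref * (v_ref / v)   # assumed scaling: bits ~ 1/voxel volume
        print(f"{label:>30}: ~1e{round(math.log10(rho)):+d} g/cm^3")
    ```

    The Planck-volume row lands within a few orders of TFS's "117 orders of magnitude larger", so this scaling is plausibly what the paper does -- but that's inference, not something TFS states.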

    • Re: Numerology (Score:5, Interesting)

      by funky_vibes ( 664942 ) on Wednesday February 18, 2015 @10:59AM (#49079581) Homepage

      The idea does actually work if the assumption is that we are living in a simulation, similar to ours. ;)

      • Re: Numerology (Score:5, Interesting)

        by Wycliffe ( 116160 ) on Wednesday February 18, 2015 @11:07AM (#49079625) Homepage

        The idea does actually work if the assumption is that we are living in a simulation, similar to ours. ;)

        That's actually what I thought too; I've pondered this before. If we are in a simulation, then stuff at the microscopic or macroscopic scale only has to exist when viewed, and can be generalized to a much lower resolution the rest of the time, which would greatly reduce the processing power required. This might also help explain some of the observation effects of quantum physics, where things seem to act differently when observed.

        • If you replace simulation with indeterminacy due to the many-worlds interpretation of quantum mechanics, it also doesn't require much information from weakly interacting events. I always thought collapse was such an ill-defined, mathematically clunky and non-symmetrical contrivance that it had to be BS. It makes much more sense, at least to me, that you never really collapse the wave function, only the local appearance of it when you arrive at a definite outcome, and that a vast multitude of states, just as
        • by Xyrus ( 755017 )

          The idea does actually work if the assumption is that we are living in a simulation, similar to ours. ;)

          That's actually what I thought too; I've pondered this before. If we are in a simulation, then stuff at the microscopic or macroscopic scale only has to exist when viewed, and can be generalized to a much lower resolution the rest of the time, which would greatly reduce the processing power required. This might also help explain some of the observation effects of quantum physics, where things seem to act differently when observed.

          No, you can't generalize to a low resolution case most of the time because the resulting computed "frames" would begin to deviate immediately from any form of calculation you try to apply. Chaos and all that.

          • Depends on what's simplified. I could imagine stars that were represented by a collection of variables, such that the difference can't be perceived by any current instrument. As the instruments improve, more detail is used on local stars. Back in Greek times, stars were simple, having position, proper motion, spectrum, and magnitude (which could be fixed or varying). When people got instruments that could tell more about stars, the simulators plugged in more detail, keeping it completely compatible wit

      • Re: (Score:2, Insightful)

        Having to apply an arbitrary fudge factor to a calculation just screams BS.

        Also, who says that the universe, at some point, isn't analog, or at least multi-state instead of binary?

        Crap "science" based on a series of crap assumptions. Using the same technique (using arbitrary values and assumptions) we can "prove" that the dark matter is magic jelly beans.

        • Re: Numerology (Score:4, Insightful)

          by LordLimecat ( 1103839 ) on Wednesday February 18, 2015 @11:44AM (#49079847)

          Fun fact: Everything you know is predicated on some set of assumptions.

          • by BarbaraHudson ( 3785311 ) <<barbara.jane.hudson> <at> <icloud.com>> on Wednesday February 18, 2015 @12:15PM (#49080117) Journal

            Fun fact: Everything you know is predicated on some set of assumptions.

            You're just assuming that's true :-)

          • Everything you know is predicated on some set of *axioms*, not assumptions. For example: Euclidean vs. non-Euclidean geometry. Do you accept axiom 5 as part of your axioms? A || B || C => A || C. Good science will investigate its own axioms.
            • What about the trustworthiness of your own senses? What about the idea that our universe is an inherently logical and rule-following one?

              You cannot "prove" those. You cannot even prove that everything out of your perceptive range continues to exist for the time that you cannot perceive it. Heck, one cant even really prove that their entire experience is not a simulation or dream.

              These are all assumptions, not axioms; they cannot be proven, and must be accepted.

              But more than that, I would wager that you w

              • You cannot even prove that everything out of your perceptive range continues to exist for the time that you cannot perceive it.

                So that crazy old guy who came up behind me and whacked me on the head with a rake (I was picking up my dog poop on public property, but he hates dogs) didn't exist until the rake hit me? And after the cops took him away he went back to fairytale land?

                This is proof that objects don't have to be perceived by you to be real. Any argument that tries to work around that requires so many absurd assumptions that it's just not worth it.

        • The first time we stumbled upon pi or any one of the constants, they were also just arbitrary numbers.

          But sure, it's way too premature to be publishing any sort of analysis like this.
          If this value starts popping up in various unrelated research, it may warrant a closer look.

          • Pi is easily proven to be valid by observation and measurement. It wasn't invented, it's a description of what people had already observed.

            Same with the speed of light in a vacuum.

            This number has no physical manifestation - it's an unfalsifiable claim [logicallyfallacious.com]. There is absolutely no evidence for, nor any way of testing, this "theory." We could just as well say that the dark matter is all magic pixie dust - magic because it can't be detected even though it has mass (an undetectable mass? can't happen).
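            For what it's worth, pi really can be "measured" rather than derived. A minimal Monte Carlo sketch: throw random points at a unit square and count how many land inside the quarter circle:

            ```python
            import random

            # The fraction of random points in the unit square that fall
            # inside the quarter circle approaches pi/4.
            n = 1_000_000
            inside = sum(1 for _ in range(n)
                         if random.random() ** 2 + random.random() ** 2 <= 1.0)
            print(4 * inside / n)    # ~3.14
            ```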

          • Pi isn't a constant [youtube.com]. It is half a constant.
            • That's complete nonsense. Either a constant is:

              - static --> constant or
              - dynamic --> variable

              A constant divided by another constant is _still_ a _single_ constant.

              Vi is simply trolling people.

              • A constant [wolfram.com] is a number that is significantly interesting in some way. Fractions or multiples of a constant (which, granted, are just as invariable as the constant itself) are not interesting in and of themselves, but only in relation to the base constant from which they are derived. Pi is only interesting because it is half of tau.

                A circle is the set of points in a plane equidistant from a fixed point. That distance is called the radius. The perimeter of the circle is the circumference. The circle constant should be th
                • If you don't like the extra 2, then use Tau. It is not like you don't have a choice.

                  > Using the diameter is one of the biggest blunders in the history of mathematics.

                  /sarcasm Truly First World Problems.

                  Equations are statements of facts. Projecting your _opinion_ and _emotions_ onto them doesn't change the truth about them.

                  First, you argue that the superfluous 2 ...

                  C = 2 * Pi * r

                  ... is sloppy. Now you're arguing it is a blunder to use the simpler ...

                  C = Pi * D

                  So which is it? Sl

                • by khallow ( 566160 )
                  Like e^(i*pi) = -1? I don't see the 2 in that one.
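                  Both versions check out numerically (tau = 2*pi just moves the factor of 2 into the constant):

                  ```python
                  import cmath

                  print(cmath.exp(1j * cmath.pi))   # ~(-1+0j): Euler's identity, no 2 in sight
                  print(cmath.exp(1j * cmath.tau))  # ~(1+0j): the tau version, e^(i*tau) = 1
                  ```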
              • Ah, you're not an old FORTRAN programmer. Often, small integers would be stored in memory locations (I don't know exactly why). If you passed a constant to a subroutine that passed it to another subroutine that changed it, you could change 4 so it was now 5. If you assigned the constant to a variable in the first place, the variable would be changed by the third subroutine, but that wouldn't change the underlying number. Hence, constants were dynamic.

    • Re:Numerology (Score:4, Informative)

      by Charliemopps ( 1157495 ) on Wednesday February 18, 2015 @11:16AM (#49079671)

      Why, for instance, 10 cubic-kilometer voxels? Why not 100, or 1, or 0.1? How about 10^18 cubic kilometers, which is about the volume of the sun? Adjust this number correctly, and you can match any energy density you want.

      This is the problem with the science blogosphere: they'll take any press release whatsoever and echo it around regardless of whether or not it makes any fucking sense at all.

      No, they are basing it on the Planck length: http://en.wikipedia.org/wiki/P... [wikipedia.org]
      A unit of measure derived specifically from universal constants: the speed of light, the Planck constant, and the gravitational constant.

      So it's not some arbitrary unit of measure as you suggest. It's the universe's unit of measure (assuming our current model of the universe holds). It's the smallest unit of measure that has any meaning in the real world.
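      For concreteness, the Planck length falls out of exactly those three constants:

      ```python
      import math

      hbar = 1.054571817e-34   # reduced Planck constant, J*s
      G = 6.67430e-11          # Newton's gravitational constant, m^3 kg^-1 s^-2
      c = 2.99792458e8         # speed of light, m/s

      l_p = math.sqrt(hbar * G / c**3)   # Planck length
      print(f"{l_p:.2e} m")              # ~1.62e-35 m
      ```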

      • No, I think this information theory "approach" uses 10 km^3 voxels:

        Specifying the location of the 10^25 stars in the visible universe to an accuracy of 10 cubic kilometers...gives an energy density of about 10^-30 g/cm^3. ...But if the location has to be specified to the Planck length, then the energy density is about 117 orders of magnitude larger.

        So they roughly recover the quantum mechanical (apparently incorrect) result if they use Planck length^3 voxels.

        Not that I read the article of course, but this seems an odd thing to do, as you should probably be confining them to hbar units of phase-space, not just confining them to voxels.

        • Re:Numerology (Score:4, Interesting)

          by rpresser ( 610529 ) <rpresser&gmail,com> on Wednesday February 18, 2015 @11:52AM (#49079899)

          It makes a tiny bit of sense to me.

          "If we use [unstated first principles] to estimate what energy density should be, it's about 10^94 g/cm^3.
          If we use the information content at the Planck scale, it's pretty close -- about 10^90 g/cm^3.

          But we actually observe an information density of about 10^-27 g/cm^3.
          And if we decrease the resolution from Planck scale voxels to 10 km^3 voxels, we get an information density that equates to 10^-27 g/cm^3.

          This is evidence that we are living in a simulation, and the programmers aren't running the universe at Planck scale voxels, but only star sized voxels."

          A large mountain of salt needs to be taken with this argument, but it does make sense -- as an argument.

          • Re:Numerology (Score:5, Insightful)

            by by (1706743) ( 1706744 ) on Wednesday February 18, 2015 @12:41PM (#49080353)
            Good point. I think, though, that they approached it completely backwards: they have presented a method for determining the information-theory voxel size of the universe (or whatever you like to call it), NOT the energy density, as TFS claims. That is, I think they should have started with the correct answer (10^-27 g/cm^3) and derived the voxel size from there. Then we could debate the physical meaning of this voxel, which is a legitimate thing to talk about.
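            A sketch of that backwards derivation, under assumptions of my own (Landauer cost at the CMB temperature, observable-universe volume, and a bit count scaling inversely with voxel volume, calibrated to TFS's 10 km^3 <-> 10^93 bits pair -- none of this is the paper's stated method):

            ```python
            import math

            k_B, c, T = 1.380649e-23, 2.998e8, 2.725   # SI; T is the CMB (assumed)
            V = (4 / 3) * math.pi * (4.4e26) ** 3      # observable universe, ~3.6e80 m^3

            rho_obs = 1e-27 * 1e3                      # observed density, kg/m^3
            bits = rho_obs * c**2 * V / (k_B * T * math.log(2))

            # Calibrate bits ~ 1/voxel_volume against TFS's 10 km^3 <-> 1e93 bits
            v_voxel = (10 * 1e9) * 1e93 / bits
            print(f"bits ~ {bits:.0e}, voxel ~ {v_voxel:.0e} m^3")   # ~1e96 bits, ~8e6 m^3
            ```

            Under those assumptions, the observed 10^-27 g/cm^3 corresponds to voxels roughly a thousandth of the paper's 10 km^3 (1e10 m^3) -- the density fixes the voxel size, not the other way around.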
        • Not that I read the article of course, but this seems an odd thing to do

          Most slashdotters seem to agree with you, for pretty much any article :-)

      • by khallow ( 566160 )
        No, they're basing it on an arbitrary volume unit that was made to be 10^120 larger than Planck length cubed. It still doesn't make any sense in that regard.

        Another approach is to suppose that relative to a single point of space we try to nail down the position of everything we can see to as accurately as we can. We're going to have more trouble nailing down the position of distant locations because it's harder to build out a sensory network to triangulate positions or launch retroreflectors to everythin
      • Re:Numerology (Score:5, Informative)

        by burtosis ( 1124179 ) on Wednesday February 18, 2015 @11:40AM (#49079807)
        Not sure if you are describing it correctly. They are not basing it on the Planck length. They show that if you do, the energy density of the universe is off by 117 orders of magnitude, close to the 120 orders you get if you calculate the energy density of dark energy from first principles. The 10^4 km figure isn't totally random; it's based on the free energy associated with encoding a center of mass classically in such a way as to make it unambiguous to independent observers.

        IANAP, but it still smacks of numerology, because the paper does not give any basis for why the mass of stars is important in any way. There is plenty of ordinary matter not in stars: black holes, etc. What would have caught my attention is if it had made a case based on mass and not just stars, or at least given a relevant basis for why it is negligible to discard non-star matter.

        tl;dr: numerology. Though props to the author for saying it can be easily dismissed as numerology in his own paper - that's good scientific method.

      • > It's the smallest unit of measure that has any meaning in the real world.

        That's actually an open question in science.

        We _assume_ that because we can't _actually_ measure anything smaller than the Planck length (at this time, with our current technology).

      • :-) -- My caricature of Muhammad. Please don't kill me.

        How about this?

        oO:-|>>

        Is that enough to be blasphemy?

    • Re:Numerology (Score:5, Interesting)

      by lgw ( 121541 ) on Wednesday February 18, 2015 @02:24PM (#49081091) Journal

      Why, for instance, 10 cubic-kilometer voxels? Why not 100, or 1, or 0.1? How about 10^18 cubic kilometers, which is about the volume of the sun? Adjust this number correctly, and you can match any energy density you want.

      Fundamentally, you can't model the universe as voxels in the first place. The Holographic principle, [wikipedia.org] or at least the part about maximum information density, seems quite solid. There's a maximum entropy available in a volume (and thus a maximum amount of information needed to describe that volume) that's proportional to surface area, not volume. The number is absurdly high, about 10^69 bits per square meter, but for extremely large volumes the cube/square effect starts making that limit meaningful. And that limit prevents you from ever using voxels of the "natural" size of one cubic Planck length - the precision that would otherwise suffice to model everything.

      Perhaps the 10 cubic-kilometer voxels are reasoned from the limit for the visible universe? Still sounds high, even for that volume, and the visible universe seems like an arbitrary boundary.
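      A quick sketch of that limit, assuming the usual form of the holographic bound (one bit per 4*ln2 Planck areas) and an observable-universe radius of ~4.4e26 m:

      ```python
      import math

      l_p = 1.616e-35    # Planck length, m
      R = 4.4e26         # observable-universe radius, m (assumed)

      area_bits = 4 * math.pi * R**2 / (4 * l_p**2 * math.log(2))  # holographic bound
      planck_voxels = (4 / 3) * math.pi * R**3 / l_p**3            # naive voxel count

      print(f"holographic bound: ~1e{round(math.log10(area_bits))} bits")
      print(f"Planck voxels:     ~1e{round(math.log10(planck_voxels))}")
      ```

      For a region this size the area bound (~1e124 bits) undercuts the naive Planck-voxel count (~1e185) by about 60 orders of magnitude, which is the point above; it also works out to ~1.4e69 bits per square meter.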

      • The visible universe is slightly smaller than the hypothetical light cone originating at the time of the Big Bang. The last neutrino scattering surface is closer to this hypothetical size. The hypothetical light cone is what defines the surface area of the universe; inflationary models, relativity, etc. all show how the shape and size of the visible universe are observer dependent.

        That said, the 10 km cubes come from the free energy associated with representing a classical, definite spatial location of all th

  • by retroworks ( 652802 ) on Wednesday February 18, 2015 @10:56AM (#49079553) Homepage Journal
    And is false information "anti-matter"? Could be we will witness the end of the universe in a flame war on /.
  • by Bruha ( 412869 ) on Wednesday February 18, 2015 @11:01AM (#49079589) Homepage Journal

    What if the universe is 120 times larger? Maybe our part of the observable universe just looks like it happened from a Big Bang.

    • What if the universe is 120 times larger? Maybe our part of the observable universe just looks like it happened from a Big Bang.

      Well, actually, the universe is infinite in all directions according to most. They're basing their math here on a given volume, "the observable universe", which makes sense given how relativity works. You know, it's the whole cat paradox: if you cannot observe it, it does not exist, etc...

    • Actually the visible universe is only 1/1,000th or so of the minimum size of the actual universe as predicted by inflation measurements. Each second we can see another 186 thousand miles, revealing new 'observable universe'. It's predicted that the actual size of the universe may in fact be infinite, as nearly all plausible inflationary models predict infinite size.
      • Each second we can see another 186 thousand miles, revealing new 'observable universe'.

        Actually that is not quite true. The size of the universe that we can see is actually shrinking. This very counterintuitive result is due to the fact that the universe's expansion is accelerating due to Dark Energy. Hence a distant point in space that is moving away from us very close to the speed of light today, due to the expansion of space, will be moving away from us faster than the speed of light tomorrow, and so will become causally disconnected from us. So with time our horizon will

        • Not at all. You are about a hundred billion years premature. Right now dark energy is having a negligible effect, only just now accelerating the expansion noticeably in recent cosmological time. The main effect is the hypothetical light cone from the Big Bang to the present - nearly the same as the last neutrino scattering surface. The visible universe (visible light after reionization) is not 'expanding', but previously disconnected locations in spacetime (last connected only during inflation) are com
    • It's not 120 times. It's 120 orders of magnitude or 1,000,000,000, 000,000,000,000,000,000,000, 000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000, 000,000,000,000,000,000,000, 000,000,000,000,000,000,000.

      (the spaces are to get past the lameness filter)
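      Or, computed rather than typed out:

      ```python
      import math

      predicted, observed = 1e94, 1e-27        # g/cm^3, both from TFS
      print(math.log10(predicted / observed))  # 121.0 -- the famous "~120 orders"
      ```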

    • by Xyrus ( 755017 )

      What if the universe is 120 times larger? Maybe our part of the observable universe just looks like it happened from a Big Bang.

      For all we know our universe is just the latest in a string of "detonations" due to a locus of instability in the omniverse. Think of it like ripples in water caused by a drop falling into it. Each drop (the "bang") creates a universe, and the resulting "wave" pushes the preceding bangs outward, causing expansion. Simultaneously, all other bangs are hidden by the peaks of the wave, since they all reside in the troughs.

      The waves in this case would represent a multi-dimensional "buckling" as a result of the explos

  • If you're starting with the location of stars, that's hardly a first principles calculation...?
  • Is it just me? slashdot seems to be very broken right now

    • by MrL0G1C ( 867445 )

      ...I actually had a relevant thing to post but slashdot lost it.

      • Sorry. We were running a little low on entropy in non-information bearing degrees of freedom of our information processing apparatus or environment.
        Sincerely, Slashdot
        • running a little low on entropy

          You'd think someone who can reverse entropy could keep a damn website up...

    • It was broken for me for a minute or two.

      Also, once in a while I've been getting SSL certificate errors regarding ./ from Chrome.

      • by ledow ( 319597 )

        At times, everything but the front page is being served by a third-party CDN, by the looks of it. When that happens, you get "logged out", and content pages fail with certificate errors because they're not coming from slashdot.org but a cdn domain.

        Either slashdot are under attack and keeping quiet, or they're falling over and keeping quiet.

        • by MrL0G1C ( 867445 )

          My hunch is attack, since slashdot is a tech site it would be nice if they actually told us what is going on.

          And if it is an attack, then is it business or pleasure? Lame either way.

    • Is that because you can or you cannot read the fine articles?

    • Comment removed based on user account deletion
    • Slashdot has been broken on and off since it was down for a full half day - attributed to a drive failure and subsequent corruption. I've been suspecting something has been up for about a week.
  • To start with, I should point out that I am far from knowledgeable on these topics. I took physics in college, but my degrees are in math and CS.

    But I've been reading a little on cosmology, QM and speculations about where our understanding is headed, and it's occurred to me (probably because one of the books I read suggested it; I don't recall) that a plausible explanation for observed reality may be that matter and energy are merely configurations of an underlying "substance": spacetime. Or, if you're a

  • by Anonymous Coward

    "Today's scientists have substituted mathematics for experiments, and they wander off through equation after equation, and eventually build a structure which has no relation to reality." - Nikola Tesla
    "The scientists from Franklin to Morse were clear thinkers and did not produce erroneous theories. The scientists of today think deeply instead of clearly. One must be sane to think clearly, but one can think deeply and be quite insane" - Nikola Tesla
    "There is not self containing theory possible aside from pra

    • by iluvcapra ( 782887 ) on Wednesday February 18, 2015 @12:45PM (#49080379)

      Sounds like a bunch of philistine engineers to me. Armstrong's quote could easily be applied to Einstein or Maxwell. Heaviside probably would have condemned the Manhattan Project as a bunch of theorists.

      It's telling that Tesla draws the line at Morse, who invented Tesla's chosen field of engineering. And Tesla was a brilliant engineer. But later, as an actual scientist and researcher, as someone that had to do experiments and develop new theory, Tesla was a failure. His work was a dead end.

    • "Oh shit, I just blew up your power plant" - Nikola Tesla
    • True if one replaces "scientists" with "non-intuitive scientists".

      The greats -- Newton, Poincare, Einstein, James Clerk Maxwell -- were intuitive.
  • That is what Nassim Haramein is studying at resonance.is and on facebook here: https://www.facebook.com/Nassi... [facebook.com] . Check out one of his articles here: http://resonance.is/firewalls-... [resonance.is] . At the lower end of the cosmological scale lies the Planck Spherical Unit: http://resonance.is/news/quest... [resonance.is] . Page 5 of Nassim's Scaling Law PDF has a nice graph of the universe: http://hiup.org/wp-content/upl... [hiup.org] .
    • If we are unaware, then how do we have a hypothesis?

    • by wytcld ( 179112 )

      The concept of "simulation" still requires a reality in which simulation occurs. If nothing exists outside the simulation it's not, in any meaningful sense, a simulation. It's just reality. Also, those who experience a simulation exist outside of it. There is no experiencing of the weather going on within your computer simulation of the weather - although you could do some sort of immersive VR and experience it. But that's because you're in the world it's being simulated from, and do not owe your own existe

  • Color me unimpressed. While somewhat original, the whole approach is completely flawed. There are many more things than just stars in the universe. After all, for all we know, the visible universe only makes up a small portion of all matter [space.com].

"If it ain't broke, don't fix it." - Bert Lantz

Working...