Science

More on the Fine Structure Constant 196

Bonker writes "Neat news from the Beeb. It turns out that data collected from observation of quasars indicates that the fine structure constant of the universe, aka 'Alpha', may have changed since the universe began. It may have been very slightly smaller than it is right now. The article hints that other constants we're familiar with, such as high, holy 'c', may also vary over time. Of course values can't have changed dramatically, because that would mean that low-weight atoms such as carbon would be unstable, and without carbon, there wouldn't be anyone around to measure the fine structure constant anyway." We ran a story about this last year. It looks like the team has continued to check their work for errors and hasn't found any yet.
  • While we are at it, we should also check whether the value of pi is changing.

    After that e, sqrt(2), 1 and 0.

    For those of you who do not know humour: %$#%$
    • Re:Pi? (Score:3, Interesting)

      by os2fan ( 254461 )
      This is not as silly as it may seem. If we're slowly dilating on a hyperbolic space, then the circumference of the circle may be getting bigger...

      Alternately, if we're shrinking on a hyperbolic space (i.e. staying the same size on an ever-growing space), then pi should be getting smaller.

      Actually, unlike 0, 1, and e, pi is not "a fundamental constant", but a convenient artefact that allows circles and spheres to be expressed. For example, one can use any number "k", and express pi in terms of "k". The definition of k would be different, but that's OK.

      For example, if k were pi/4, we would say that the circumference of the circle is 8kr, and its area 4kr^2. For diameters, circ = 4kd, and area = kd^2. This makes the circumference and area k times those of the circumscribing square.
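      A quick numerical check of that bookkeeping (my own sketch in C, with an arbitrary radius; 8k reduces to 2*pi and 4k to pi):

      /* check that circ = 8kr and area = 4kr^2 match the usual formulas when k = pi/4 */
      #include <math.h>
      #include <stdio.h>

      int main(void) {
          const double pi = acos(-1.0);
          const double k  = pi / 4.0;
          const double r  = 2.5;                /* arbitrary radius */

          printf("circumference: 2*pi*r = %f, 8*k*r   = %f\n", 2 * pi * r, 8 * k * r);
          printf("area:          pi*r^2 = %f, 4*k*r^2 = %f\n", pi * r * r, 4 * k * r * r);
          return 0;
      }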

      Also, I have played with a set of mathematics that makes the surface of the sphere 8 pi r^2, with pi=3.14159265359 &c. This has an effect on the "rationalisation" in physics, where 4pi gets replaced by 8pi.

      Mathematics has a lot of preconceived notions in it.

      • Pi is not defined in terms of physical measurements; it is defined in terms of idealized mathematical concepts. If the universe is "slowly dilating on a hyperbolic space", then the circumference of a hula-hoop would be changing, but the circumference of an idealized unit circle on an idealized plane wouldn't change.
    • Wasn't it a Greg Bear novel, where they had a sensor that checked for variations in the mathematical constants? Something about an artificial superstring, where as the heroine walked closer to it, the value of Pi dropped as low as 2.8. Dammit, can't think of the title, but the superstring was called the "Way".
    • Re:Pi? (Score:3, Funny)

      by Tablizer ( 95088 )
      "While we are at it, we should also chek whether the value of pi is changing. " After that e, sqrt(2), 1 and 0.

      Hmmm. Maybe in a dozen billion years or so, my slashdot Karma may be worth more also.

  • The BBC site doesn't work for me. It couldn't be /.ed already?

    From what I remember, the change in the alpha value, if there is one, would only be about a billionth of what it is today.

  • by astrophysics ( 85561 ) on Sunday May 19, 2002 @01:32AM (#3544832)
    It's the same group saying the same thing again. Although I haven't reviewed their latest paper, I remember that I wasn't impressed with the statistical analysis of their data in the previous paper.

    Personally, I won't find the evidence convincing until another group takes their own data and gets similar results. Given that many astronomers have similar sentiments, giving VLT time to the same group seems not the best use of VLT time.

    Of course, if no other astronomers find the likelihood of the discovery worth the effort of making the observations, then it may be difficult to get independent confirmation. Given that it would be a really big deal if true, I think that says a lot about how seriously the astronomical community takes these claims.

  • by TheFlu ( 213162 ) on Sunday May 19, 2002 @01:32AM (#3544834) Homepage
    The number of dates I've had in the past year. Of course this tends to cause some division by zero errors.
    • A more useful number for me would be the fraction of successful dates (*1), which, while non-zero, can be seen to converge to within epsilon of zero as T goes to T(divorce) + infinity. We can represent this value by the lowercase Greek letter sigma (*2).

      Raise sigma to the power of the money spent on those dates (which, perhaps counter-intuitively, appears to be inversely related to sigma itself), and we have a value that can be substituted for zero for most practical purposes, while remaining safe for division, though it may strain the limits of floating-point precision.

      --
      (*1) For any given meaning of "successful". I'll leave it to you sick monkeys to guess whether I mean what you think I mean.

      (*2) For reasons that should be obvious.
  • No hints about c (Score:5, Interesting)

    by NoBeardPete ( 459617 ) on Sunday May 19, 2002 @01:35AM (#3544840)

    The article actually doesn't really hint that 'c' is changing, which is good, because it's not clear what would be meant by that. The article says that several physicists have previously wondered if it could change. It then goes on to quote a modern physicist as saying that they were wrong.

    I think c is best thought of as a man-made constant. Just as I might say that there are 2.54 centimetres per inch, I can say that there are ~3*10^8 metres per second. Neither of these really contains any information about the universe outside of our perception of it. It is simply a statement of how one system of measurement compares to another. 2.54 centimetres per inch evaluates to unity (the number 1, with no units) if you actually evaluate it. Likewise, physicists commonly use unity as the speed of light, because in a very meaningful way, it is.

    If I suddenly magically increased c by 10%, that would be indistinguishable from stretching the universe by 10% in every spatial direction. Consider that the speed of light is essentially unity, and that expressing it otherwise is really more a statement about the systems of measurement we use than about physical reality. This makes it seem silly to say that I have magically increased c by 10%, and more reasonable to say that I have stretched the universe by 10% in every direction.

    • Dunno if it'd be worse for me to be right and nitpicking, or wrong and stupid, but wouldn't c 'increasing' by 10% be equivalent to the universe 'shrinking' or compressing by 10%? It would seem that moving around faster is the same as everything being smaller...
    • That's right. The speed of light 'c', is just a conversion factor (like 2.54cm/inch).
      Before, we could measure lengths more accurately than velocities, and so we defined c in terms of meters per second.
      But now, since we can measure velocity with more precision and accuracy than distance, a meter is now defined in terms of the length of a second and c.
      Now:
      The second is the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium 133 atom. (think atomic clock)
      The speed of light in a vacuum is defined to be exactly 299,792,458 meters per second.
      The "length" of a meter then is a derived value!

      So there are consequences of a changing c, since a different value of c will change the amount of energy needed to accelerate a mass, and will affect the structure of the universe....but I'm not sure how they would go about figuring that out. :)
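      For what it's worth, here's a little numerical sketch of that derivation (my own, in C; both input numbers are exact by definition):

      /* with the second (caesium) and c both defined, the metre is derived:
         it's the distance light covers in 1/299792458 s, i.e. about 30.66
         periods of the Cs-133 hyperfine radiation */
      #include <stdio.h>

      int main(void) {
          const double c_m_per_s = 299792458.0;    /* defined exactly */
          const double cs_hz     = 9192631770.0;   /* defined exactly */

          printf("light crosses 1 m in %.6e s\n", 1.0 / c_m_per_s);
          printf("that is %.4f periods of the caesium clock transition\n",
                 cs_hz / c_m_per_s);
          return 0;
      }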
    • by floW enoL ( 312121 )
      Um...c is not a "man-made" constant. Although you're right that it's simpler to set it to 1 and have all the units change around it, increasing c by 10% would have some noticeable effect.

      c is the speed at which electromagnetic waves propagate. Calling it the speed of light is somewhat of a misnomer; it might be better to say that light moves at the speed of propagation of electromagnetic waves, since it is, after all, an electromagnetic wave. Furthermore, it turns out that the wave equation implies that c = 1 / square_root(e_naught * mu_naught), where e_naught is the permittivity of free space (ratio of charge to electric flux in vacuum) and mu_naught is the permeability of free space (ratio of current to magnetic flux in vacuum). These two are experimental constants which the speed of light happens to depend on (although now the speed of light is taken to be a fundamental definition). Therefore, an increase in the speed of light by 10% would imply a change in either or both of those fundamental constants, which may have drastic effects, comparable to G (the universal gravitational constant) being 10% greater.
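      Numerically (a quick sketch of my own, using the usual SI values, not anything from the article):

      /* c recovered from the wave-equation relation c = 1/sqrt(epsilon_0 * mu_0) */
      #include <math.h>
      #include <stdio.h>

      int main(void) {
          const double epsilon_0 = 8.8541878128e-12;   /* F/m  */
          const double mu_0      = 1.25663706212e-6;   /* N/A^2, ~4*pi*1e-7 */

          printf("c = %.6e m/s (expect ~2.998e8)\n", 1.0 / sqrt(epsilon_0 * mu_0));
          return 0;
      }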
      • The speed of light in a vacuum cannot change, by current definition. The meter is currently defined as 1/299,792,458 of the distance light travels in a vacuum in one second. Of course that hasn't always been the definition: the basis for the meter has been, in historical order:

        (1) The quadrant of the Earth.
        (2) The length of a particular metal bar.
        (3) The wavelength of a particular atomic spectral line.
        (4) The speed of light and the frequency of an atomic clock.

        Each change improved the reproducibility of the best length measurements, given the technology at the time the change was made.

        The observations that suggest alpha varies are based on comparing wavelengths of light from different atomic oscillations, potential distance standards similar to (3) above. They appear to vary relative to each other. If you want to attribute this to varying c, which one is the reference yardstick?

        For our present technology the most reproducible clocks and yardsticks are atomic oscillations. If these lack relative constancy and you choose the frequency of one as your time standard and the wavelength of another as your length standard, you will apparently observe a changing value of c. However, the direction and magnitude of the change will depend on which pair you choose. If we had really independent distance and time standards (and it was clear which was which), it would make sense to consider c an experimental quantity. Since we don't, we have just chosen one standard (a particular oscillation of cesium), and c is a defined constant.

        Similarly, the electromagnetic quantities "epsilon nought" and "mu nought" were once experimental quantities, but are now by definition exactly 10^7/(4*pi*c^2) and 4*pi*10^-7 respectively. This means that the coulomb is no longer defined electrochemically: it is a derived unit, not a fundamental one in the SI system.

    • 'c' is more than just an arbitrary constant, it does have fundamental meaning, as it relates to the scaling of the time vs. the space dimensions.
      Think of c as an aspect ratio. The ratios of the space dimensions are (usually) 1, which is like a computer screen made of perfectly square pixels laid out evenly; if you rotate an image on that screen, it doesn't change at all. If the ratio is not 1, then rotating the image makes a difference, and the distortion (circles become ellipses, faces get squished, etc.) is dependent upon the ratio.

      Where is this 'distortion' then, in space-time? It is all the weird things that happen when objects approach the speed of light. Moving between reference frames is like rotating the screen, and thus the value of c influences these behaviors. So what? Even if you're not a space traveler, things like the color of gold are dependent on this number. Shift 'c' enough, and it's lead that's pretty and gold dull.

      The units we use may be convenient and flexible, but space and time are not arbitrary, even ignoring relativity. For example, you could measure the speed of light in terms of (diameters of uranium nuclei) per (period of light emitted by first ionization of hydrogen) or any such, and it should be clear that changing this velocity would require rewiring the universe.

      -AlphaGeek
    • People who are interested in this stuff should check out this preprint by Duff, Okun, and Veneziano:

      http://arxiv.org/abs/physics/0110060 [arxiv.org]

      It's a discussion of how many "dimensionful" fundamental constants there really are. I went to a seminar given by Mike Duff here in Ann Arbor and he had me pretty well convinced that the only numbers that matter are things like alpha, and if you were to change c, h, etc. together in ways which left alpha unchanged, you wouldn't be able to tell the difference.
    • If I suddenly magically increased c by 10%, that would be indisinguishable from stretching the universe by 10% in every spacial direction.

      Oh well, that's OK then ;-)

      Um. Actually, you would effectively shrink the universe by 10%.

      Seriously though, that would be very, very bad. Consider the moon in its orbit. It's sitting up there, and suddenly it's 10% closer. Its mass is the same and the mass of the earth is the same, but suddenly it's 10% closer, and going 10% faster as well.

      Basically, that would change its orbit, really noticeably, but that doesn't matter.

      The earth would be a different distance from the Sun; that would have a devastating effect on the biosphere. It would get really hot. But that doesn't matter either.

      Far worse: all of the interatomic bonds holding the earth together would suddenly be 10% too short. The earth has been shrunk by 10%, and it doesn't like that. The earth would explode back out to its original size, overshoot, and finally relax to its original size, liberating immense energy. Everything on the earth would disintegrate. Without doing the maths, a quadrillion on the Richter scale sounds about right. (Note that the Richter scale is logarithmic.) That would matter.

  • by trb ( 8509 ) on Sunday May 19, 2002 @01:44AM (#3544860)
    Whoa, time to change those #defines to doubles.
  • by Bonker ( 243350 ) on Sunday May 19, 2002 @01:49AM (#3544867)
    Quoth the article:

    "If this is correct, it will radically change our view of the Universe. We have to be cautious but it could be revolutionary. We have seen something in our data - but is it what we think?"

    I like it when scientists talk about their theories in this manner. On one hand you have a whole body of researchers, scientists, and journals who are so afraid to rock the status quo that they refuse to research (or publish) controversial information. On the other, you have scientists and/or crackpots who are so paranoid and skittish towards working within the peer review system [bbc.co.uk] that we'll probably never gain access to their research, some of which may be quite important and revolutionary.

    (I quit my physics major a year in and switched to CS. At what point do 'paranoia' and 'ego-building' become required courses?)

    I think this is a nice middle ground. These guys have announced a neat finding, with the caveat that they are still in a thorough 'error-checking' mode, are looking for problems with their own research, and are implicitly inviting others to do the same.
    • That is real science at work. The sound of discovery isn't "eureka I have found", but rather, "hmm, that is odd", or in this case, "there is something here, but what is it?"
  • Is anyone else irritated because the Slashdot icon in the upper right hand corner no longer sends you back to "www.slashdot.org" but instead to the topic page (Example: "science.slashdot.org" or "ask.slashdot.org)? It's bugging the shit outta me.

    Can't spell
    Can't conjugate
    Can't proofread

    and now...

    Can't keep consistency.
  • If c isn't constant, then perhaps you can shine that flashlight off the front of a ship at near light speed and achieve faster than light with light itself ^__^

    I never claimed to be a physics expert, just an expert in physics
  • From the introduction to Bit String Physics [amazon.com]:
    This interest of mine in scientific revolutions remained casual, until I heard Ted Bastin talk about the combinatorial hierarchy in 1973. This remarkable construction, "discovered" by Frederick Parker-Rhodes in 1961, yields algorithmically the sequence 3, 10, 137 (~hc/e**2), 2**127 + 136 (~1.7*10**38 ~ hc/Gm(p)**2) and cannot be extended past the fourth term for reasons intrinsic to the construction. Why a simple mathematical algorithm should have anything to do with two of the fundamental dimensionless constants of modern physics remained unexplained, and so far as I am concerned remains unexplained to this day. It could -- as the prevailing paradigms in theoretical physics seem to require -- just be a coincidence, like the "prediction" by Swift that Mars has two satellites. To make it plausible that, although still mysterious, the fact that the number of entities calculated for the third and fourth levels of the combinatorial hierarchy correspond closely to the two dimensionless numbers which characterize the two long range, macroscopic forces observed in nature (electromagnetism and gravitation) is probably something more than a coincidence is a main objective of this book.
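    The quoted sequence can be generated in a few lines of C. This is my own sketch of the construction as I understand it (level sizes 2^2-1, 2^3-1, 2^7-1, 2^127-1, each exponent being the previous level's size, summed cumulatively); treat the details as an assumption rather than as the book's own algorithm:

    /* cumulative combinatorial-hierarchy numbers: 3, 10, 137, 2^127+136 (~1.7e38) */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        unsigned long long level, exponent = 2, cumulative = 0;
        for (int i = 1; i <= 3; i++) {
            level       = (1ULL << exponent) - 1;   /* 3, 7, 127 */
            cumulative += level;                    /* 3, 10, 137 */
            printf("level %d: cumulative %llu\n", i, cumulative);
            exponent = level;
        }
        /* level 4 (2^127 - 1) overflows 64-bit integers, so approximate it */
        printf("level 4: cumulative ~ %.4Le\n",
               (long double)cumulative + powl(2.0L, 127) - 1.0L);
        return 0;
    }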
  • Oh well... (Score:1, Flamebait)

    by Dr. Bent ( 533421 )
    There goes my theory about the universal constants containing a message from the Creator of the universe. Guess I'll have to start working on that old universal field theory again.
  • ....does that mean we all have to go back to programming in Cobol? ;-)

    -marc
  • by ChenLing ( 20932 ) <slashdot@@@ilovedancing...org> on Sunday May 19, 2002 @03:41AM (#3545043) Homepage
    alpha is the coupling constant for the electromagnetic force.
    In other words, it determines the "strength" of the electromagnetic force. It is important because
    a) it has no units (it's just a number, approximately 1/137)
    b) it is easy to measure to a great degree of accuracy
    c) it can be measured using a variety of different experiments
    d) it is built from several fundamental physical constants (such as c, the speed of light in a vacuum; e, the charge of the electron; and h, the Planck constant).

    So a change in alpha would mean a change in one of the fundamental constants of physics.

    For more information, you can read NIST's wonderful description [nist.gov].
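    For the numerically inclined, here is a quick sketch (mine, using the standard SI/CODATA values, with hbar = h/2pi) of how those constants combine into alpha:

    /* alpha = e^2 / (4*pi*epsilon_0*hbar*c) ~ 1/137.036 */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        const double pi        = acos(-1.0);
        const double e         = 1.602176634e-19;    /* C, elementary charge */
        const double epsilon_0 = 8.8541878128e-12;   /* F/m                  */
        const double hbar      = 1.054571817e-34;    /* J s, h/(2*pi)        */
        const double c         = 299792458.0;        /* m/s                  */

        double alpha = e * e / (4.0 * pi * epsilon_0 * hbar * c);
        printf("alpha = %.9f, 1/alpha = %.3f\n", alpha, 1.0 / alpha);
        return 0;
    }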
  • I think there is something to this varying-constant idea. In fact, there are some facts about the universe that simply do not tie up with a constant c. Joao Magueijo of Imperial College has written some interesting papers looking at the implications of this here [lanl.gov], here [lanl.gov] and here [lanl.gov]

    in your favourite format.

  • I used my Q powers about 100 billion years ago to alter this constant to impress a lady Q -- forgot to change it back -- I will get right on it.

    "You dont ask how some things are done, you simply do them." - Q
  • "Of course values can't have changed dramatically, because that would mean that low-weight atoms such as carbon would be unstable, and without carbon, there wouldn't be anyone around to measure the fine structure constant anyway."

    Who is to say that carbon has always been stable... maybe one of the more unstable elements today was the stable element at the time and has become unstable as a result of the change in the constant value.

    • How about the fact that we look into deep space and see absorption or emission lines corresponding to carbon, which shows that way back in time carbon must have been stable, and present in sufficient quantity, to produce those features?

      I very much doubt that the stability of carbon has varied much in the lifetime of the universe.

    • "Of course values can't have changed dramatically, because that would mean that low-weight atoms such as carbon would be unstable, and without carbon, there wouldn't be anyone around to measure the fine structure constant anyway."..... Who is to say that carbon has always been stable... maybe one of the more unstable elements today was the stable element at the time and

      Perhaps in, say, 100 billion years from now, carbon *will* become unstable and carbon-based life-forms will evaporate (assuming no gradual evolutionary path to some other compounds). The universe may become a tough or boring place to survive down the road: less energy to harness, carbon growing useless, no pretty pictures from a Hubble-like scope because everything is dispersing and cooling, no starry nights to jump-start your date's libido.

      Eat, Drink, and Be Merry, for tomorrow is entropy.

  • Over the past two years, I've developed a decent "haha, only serious" model of the universe. It works sort of like this:

    About two years ago, Slashdot ran a story talking about the theoretical upper limit of computer speed (sorry, couldn't find a link). Basically, the idea was to convert the mass of your computer to energy to allow ALL of it to work for you. This energy, in the form of light, will create interference patterns - just like you did with the two slits in 5th grade science - and that's how the computer (which now resembles a small star) does its computing kinda thing (gross oversimplification of what the article said, but that's the gist). Now if you compress enough energy into a singularity, you have pretty much (and the "pretty much" is important) infinite computing power (due to time dilation and so on).

    Well, it just so happens that God has one of these things on his desk. Our universe is a program running inside this uber-computer that resembles a black hole.
    Earlier I said the processing power of this computer would be "pretty much" infinite. Well - it isn't big enough to handle every particle in the universe simultaneously. Some of the universe is "swapped out". Ever sit down at the computer to read slashdot, and whammo, four hours have gone by? Wonder what happened to the time? You were swapped out, that's what.
    There also appear to be problems with the branch prediction unit of this computer. Deja vu? Branch prediction made an error, and the queue had to be recalculated. Ever reached in your pocket and pulled out a $5 bill you didn't know you had? Bad branch prediction.

    If a tree falls in the woods, and no one was there to witness, does it make a sound? No. It didn't even fall. Actually, it wasn't even there. Years later, when a witness comes upon the site, all the events since the last witness came by are quickly approximated and the end results are what the new witness sees. What constitutes a witness? People? squirrels? I dunno. Doesn't matter, really.

    Can't remember if you left the oven on? Well, both options are possible, and both have been approximated. The appropriate one will be chosen when someone sees the end result (either your house burns down, or it doesn't).

    Lots of strange events can be explained with this model of the universe:

    Reincarnation/past lives/Ghosts? Bad garbage collection, or the Divine Coder forgot to unallocate memory.

    ESP? Packet snooping.

    Why can't objects with mass go faster than the speed of light? Think of everything like an object in C++. If you have a "mass" property, your object is too big to fit through the "bus" in one "fetch cycle", so your "position" property can't be updated as fast as say...a photon, which fits through the bus in one cycle.

    Why are the rules of Quantum Mechanics so strange, and why Planck's Constant? In the world of computers we know, what's smaller than a bit? Looking at things on that small a scale, we're seeing the individual bits flip from 1 to 0 in God's workstation. Of course it will look odd, and it won't mean much when compared to the world as we perceive it. Combine that with the fact that most of the universe is approximated, and you end up with really strange things happening on that small a scale.

    Why are some people luckier than others? Not all people call the same random number generator, or maybe some people can call it with a certain "seed value".

    Bermuda triangle? think of something like a bad sector on a disk, or a faulty RAM stick - of course, the computer this runs on doesn't use disks or RAM sticks, but it's still a decent analogy.

    Jesus? You play Quake/Unreal/The Sims, don't you? It just so happens that God's version of "The Sims" is a hell of a lot better than yours.

    Don't think of this as something akin to the movie "The Matrix" - because these rules we live by in this universe can't be broken. There's no dodging bullets. there's no agents... We were created parts of this simulation, and are ourselves simulated and no more or less real than the world we live in - and there's no way to get out of this simulation.
    However, maybe there is a way to use the rules to our advantage? But to do that, you need to know the real rules behind the physics we see. We'd need to know what's happening to those individual bits in the processor. If we can affect those often enough, maybe we could effectively beat the rules...?

    More important is this question: Were we created on purpose, or is this entire universe of ours that exists inside God's Workstation meant to be something else entirely? Maybe we were supposed to model plasma dynamics, and the system taking on intelligence was a by-product of the genetic algorithm that was used? Or maybe we're something like an AI experiment?
  • I would just like to take this opportunity to point out that no other science has quantities in it that have names as cool as the fine structure constant or the permeability of free space.

  • From the article, it seems that the thing they are measuring to understand the nature of how this 'constant' changed is the light that emanated from the rest of the universe and is just now reaching earth. The older light appears to show matter generally acting in one way, and the newer light appears to show matter acting in another way.

    How did they isolate this one factor in sub-atomic formulae as the only feasible explanation? How did they eliminate things like universal gravity effects (gravity appears to be instant and with unlimited range), forces acting on the light over billions of years, or changing nature of the stars as that portion of the universe ages, thus changing the light coming from them?

    This does qualify as one of those 'extraordinary claims' that themselves need both extraordinary proof and extraordinary qualification of what they are really stating.

    :^)

    Ryan Fenton
  • by dimator ( 71399 ) on Sunday May 19, 2002 @05:08AM (#3545144) Homepage Journal
    Everyone seems to love carbon. It is highly overrated if you ask me. Hydrogen, now there's an element...

  • just like C (Score:2, Funny)

    by g4dget ( 579145 )
    const double alpha = 1.0/137;

    void hack_universe(void) {
        // casting away const is undefined behaviour, which is rather the point
        *(double *)&alpha += 1e-9;
    }

    // don't call this; the universe requires
    // double-word aligned doubles

    void crash_universe(void) {
        *(double *)((char *)&alpha + 1) += 1e-9;  // misaligned access
    }
  • Maybe I'm missing something from the article, but I don't think that "constants" like alpha changing is a new idea (though it is very cool if alpha changing over time can be directly observed like this). Changing coupling "constants" is already a part of the established "Standard Model" of physics, and is an essential feature of Grand Unified Theories.

    Grand Unified Theories rely on all of the interaction strengths for all known forces (Strong force, weak force, electromagnetic force, and sometimes gravity) becoming the same at some energy scale earlier on during the formation of the Universe. In the present Universe, the strong force that holds quarks together is much stronger than the electromagnetic force, but if GUTs hold true then they were much closer earlier on.

    See here [innerx.net] for a graph illustrating this effect, or rather its failure for one particular GUT theory. This is the first I found using a quick google search for "GUT" and "coupling constant"; it is a common plot shown for papers on GUTs in general.

    It's been a couple of years since I studied this stuff. I'd be interested to know if this article is pointing to something new theoretically.
    • Maybe I'm missing something from the article, but I don't think that "constants" like alpha changing is a new idea (though it is very cool if alpha changing over time can be directly observed like this).

      You're right, it's not a new idea. The first physicist to seriously suggest time variation of basic constants was P.A.M. Dirac. His reasoning was a bit fanciful. He calculated certain dimensionless numbers by appropriately combining various dimensionful basic constants of physics and found that he got very large numbers.

      Since the numbers were very large, and one knows that the universe is very old, Dirac proposed that the great age of the universe was the reason the dimensionless combinations were so large.

      If this were true, it would follow that the fundamental constants had to be varying with time and would have been different in the very early universe. This is called Dirac's `large numbers hypothesis' for obvious reasons. It's a wild but very imaginative suggestion.

      Changing coupling "constants" is already a part of the established "Standard Model" of physics, and is an essential feature of Grand Unified Theories.

      Yes, but the kind of change of alpha being discussed in the article is most definitely not a part of the established standard model.

      Strictly speaking the standard model is only partially unified: the electroweak part or SU(2)xU(1) is unified and matches well with experiment, but the strong interaction SU(3) is not so easy to unify with SU(2)xU(1). The simplest GUT to accomplish the SU(3)xSU(2)xU(1) unification was the minimal SU(5) GUT, but it predicted decay of the proton at a rate which was experimentally ruled out some years ago by direct measurements.

      The best bet for grand unification of strong, electromagnetic and weak interactions at the moment would be some form of what's called the minimal supersymmetric standard model. This theory is beyond the standard model because it predicts supersymmetric particles, which are so far unobserved.

      But this theory is, like the standard model, basically a theory of particle physics alone. It doesn't even address gravity, much less the possible time variation over long times of fundamental constants evaluated at a *fixed* energy scale. Such a time variation breaks the Lorentz invariance of all of these theories. For that matter, such a variation breaks the general coordinate invariance of general relativity. If what is being said is true, and it is a very big if, it requires a radical change in the accepted theories.

      Grand Unified Theories rely on all of the interaction strengths for all known forces (Strong force, weak force, electromagnetic force, and sometimes gravity) becoming the same at some energy scale earlier on during the formation of the Universe.

      Well, true, but you conflate several ideas here.

      There are so-called running coupling constants in the standard model and in the various grand unified models (excluding gravity for now). So for example, if two electrons smash into each other at very high energy, say 100 GeV or so, then the appropriate value of alpha to use in calculating their interactions is more like 1/127 than 1/137. (There's a toy numerical sketch of this running at the end of this comment.)

      This running of the coupling strengths is well tested and is predicted by quantum field theory. If the strong, electromagnetic, and weak forces are actually part of a GUT, and the big bang theory is correct, then at some early time in the history of the universe, the average energy of interactions of electrons would have been 100 GeV for example.

      At that time it would make sense, on average to use a value of alpha which is 1/127, but that in no way means that this larger value of alpha should be used at that time for two electrons which collide with much lower energy. If they collide with very low energy, the appropriate value of alpha is still 1/137 if the standard model is correct. You see the distinction?

      The observations in question are actually of atomic transitions in very distant gas clouds, so they involve very low energy interactions of electrons with positively charged nuclei. So the running of the electromagnetic coupling, alpha, is not relevant to these measurements, and if any real deviation in the observed atomic transitions can be shown from what is observed at the present day, then it means atomic structure has changed. That would mean the laws of physics are really varying as the universe ages. It's quite radical. It means throwing away the standard model and general relativity.

      In the present Universe, the strong force that holds quarks together is much stronger than the electromagnetic force, but if GUTs hold true then they were much closer earlier on.

      Yes, this is an effect of the running coupling constant in GUTs, together with the big bang theory. At high temperatures, like existed in the early universe, all of the couplings would have looked the same.

      See here [innerx.net] for a graph illustrating this effect, or rather its failure for one particular GUT theory. This is the first I found using a quick google search for "GUT" and "coupling constant"; it is a common plot shown for papers on GUTs in general.

      Yes, interestingly enough, such plots showing how the electromagnetic, weak and strong couplings run to the same value at a given, fixed energy were actually used early on to argue that grand unification of the SU(5) type probably was occurring. There were errors in the original calculations though, and it turns out that the couplings don't actually become the same, they miss as shown in the plot you reference. That's one of the reasons why supersymmetric unification is proposed: you have more parameters and you can make the three couplings meet ... it's not really a very good argument if you think about it a bit!

      Its been a couple of years since I studied this stuff. I'd be interested to know if this article is pointing to something new theoretically.

      Yes it is certainly pointing to something new, if the result is correct. But there are many problems with it.

      There is the obvious unpleasantness that you don't really see a clean result: you can't really say that we see that spectral line X in atom Y was different at such and such a time and place than it is now. Instead you must look at a statistical analysis of many lines over many gas clouds, and extract a best guess on the value of alpha as a function of time (actually, redshift) in the universe, using a theoretical calculation of how atomic lines should depend on alpha. It has to be said that such a calculation is far from trivial to do for complex multi-electron atoms, and in fact can't even be done all that accurately.

      Indeed, people who did the same kind of observations previously concentrated on alkali metal doublets, which are produced in (effectively) simple single-electron atoms, for which one has a much better hope of understanding the theory well. These observations produce a null result for the variation of alpha; it's only when you add in the more complex metals, as these researchers do, that you see a non-zero effect.

      My feeling is that it is very interesting, but most likely, when a more extended analysis and more observations are done, there will be nothing left of the effect, except possibly a better null result than one had with the alkali doublets.
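      As promised, here is a toy leading-log sketch of the running of alpha (my own, in C; the quark masses are crude effective values, since the real hadronic contribution is extracted from data rather than from a formula like this, so treat the output as illustrative only):

      /* 1/alpha(Q) ~ 1/alpha(0) - (2/(3*pi)) * sum_f N_c * Q_f^2 * ln(Q/m_f) */
      #include <math.h>
      #include <stdio.h>

      int main(void) {
          const double pi = acos(-1.0);
          const double inv_alpha_0 = 137.036;
          const double Q = 91.0;                 /* GeV, roughly the Z mass */

          /* {charge, colour factor, effective mass in GeV} for each charged
             fermion lighter than Q (top quark excluded) */
          struct { double q, nc, m; } f[] = {
              { -1.0,   1.0, 0.000511 },   /* e   */
              { -1.0,   1.0, 0.1057   },   /* mu  */
              { -1.0,   1.0, 1.777    },   /* tau */
              {  2/3.0, 3.0, 0.3      },   /* u (effective mass) */
              { -1/3.0, 3.0, 0.3      },   /* d (effective mass) */
              { -1/3.0, 3.0, 0.5      },   /* s   */
              {  2/3.0, 3.0, 1.3      },   /* c   */
              { -1/3.0, 3.0, 4.2      },   /* b   */
          };

          double sum = 0.0;
          for (int i = 0; i < (int)(sizeof f / sizeof f[0]); i++)
              sum += f[i].nc * f[i].q * f[i].q * log(Q / f[i].m);

          printf("1/alpha(%g GeV) ~ %.1f\n", Q,
                 inv_alpha_0 - (2.0 / (3.0 * pi)) * sum);   /* lands near 128 */
          return 0;
      }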

  • This whole constant thing not being a constant is BS..

    Everyone knows that constant's value is 42.

  • Alpha is not the only part of physics that may have to change. Special relativity may have to change as well, to be replaced with doubly special relativity. SR gets its name because one constant (the speed of light) is deemed special and must be viewed as the same value by all observers; this seems fully compatible with all the forces of nature except one: quantum gravity. The strength of gravity is measured by a constant, the Planck mass, which is the mass at which a black hole's event horizon is about the same size as its Compton wavelength. However, there is a problem with this: observers travelling at different speeds will disagree about the size of the Planck mass, and so some physicists, including ones with such high credentials as Lee Smolin, are beginning to believe a theory known as doubly special relativity, in which both the speed of light and the Planck mass are the same for all observers. You can see a few of the papers on it at the Los Alamos archives:

    here [lanl.gov] and here [lanl.gov].

    Finally a changing speed of light is predicted in a DSR approach here [arxiv.org].
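    For scale, the Planck mass mentioned above is roughly where a particle's (reduced) Compton wavelength and its Schwarzschild radius come out the same order of magnitude. A quick sketch of my own, with standard constant values:

    /* m_P = sqrt(hbar*c/G) ~ 2.2e-8 kg ~ 1.2e19 GeV/c^2 */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        const double hbar = 1.054571817e-34;   /* J s            */
        const double c    = 299792458.0;       /* m/s            */
        const double G    = 6.67430e-11;       /* m^3 kg^-1 s^-2 */

        double m_p = sqrt(hbar * c / G);
        printf("Planck mass            = %.3e kg\n", m_p);
        printf("reduced Compton length = %.3e m\n", hbar / (m_p * c));
        printf("Schwarzschild radius   = %.3e m\n", 2.0 * G * m_p / (c * c));
        return 0;
    }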

    • Relativity has nothing to do with a speed of light that changes, any more than geodesics have anything to do with the size of the earth.

      Let me explain. Both relativity and geodesy posit that we must modify the notion of space-time or space on the large scale, to deal with the fact that the "flat model of space/time" is inappropriate in some conditions.

      In geodesy, the size of the earth is a constant that you use to convert angles into lengths, and in relativity, the speed of light is the constant you use to convert times into lengths.

      Assuming that you have other ways of recovering time and length from ancient sources, you can recover therefore an ancient speed of light.

      Actually, the book that the constants table comes from in the first place goes into the physics of defining standards [including end-length prototypes], and their dependence on different universal constants, at some length. The speed of light in this is not assumed constant...

  • This is interesting news.

    Since modern attempts to unify the fundamental physical forces began, gravity in particular has presented a difficulty for scientists, and it appears that the solution may be changes in constants we previously believed to be, well, constant.

    This could have far-reaching implications for the way we think about science, and especially our understanding of what science can tell us. It seems possible that our disciplines of science and natural history might actually be driven farther apart, as we lose any reliable base indicators on which to base assumptions about the past.

    For some in the scientific orthodoxy, this is anathema and they will fight it tooth and nail to the bitter end, for it forces them to accept a reality that they have long denied. The liberals constantly tell us that because of the relatively slow travel of light from distant galaxies, it must have been traveling for long periods of time, and the universe must therefore be quite old (billions and billions... you know the drill). Now their rationalizing will be laid bare and they must admit that the Bible has again withstood vigorous attempts at disproof, that they have a Creator and are therefore accountable to Him.
    • Oops. OK once again, properly formatted. Way to mess the post up.

      It seems possible that our disciplines of science and natural history might actually be driven farther apart, as we lose any reliable base indicators on which to base assumptions about the past.

      AFAIK natural history is science. Besides, if you read the article you would realise we don't lose anything, because the experiment can show what alpha was. If you know how a constant has changed, you can take it into account, so your indicator is fine, although the maths becomes more complex.

      For some in the scientific orthodoxy, this is anathema and they will fight it tooth and nail to the bitter end, for it forces them to accept a reality that they have long denied. The liberals constantly tell us that because of the relatively slow travel of light from distant galaxies, it must have been traveling for long periods of time, and the universe must therefore be quite old (billions and billions... you know the drill).

      What has being a liberal got to do with anything? Can only liberals be scientists? Non-liberals must be Creationists? Not to mention all kinds of other methods of dating planets, stars, rocks and the like.

      Now their rationalizing will be laid bare and they must admit that the Bible has again withstood vigorous attempts at disproof, that they have a Creator and are therefore accountable to Him.

      And here all logic breaks down. How do you get to this from a possible slight change in alpha? Let's assume that we find the constants do change over time and that this overthrows current thinking on the creation of the universe: it still doesn't prove Creationism or a Creator.

      If you want to believe in Creationism as a matter of faith that's your choice. If you want to advocate it as science you need to do real science (work from evidence to conclusion, not backwards, actually have some evidence etc.) and simply attacking current theories doesn't really help.

      Creationism doesn't have a magic win-by-default clause: disproving another theory (technically Creationism isn't even a theory, it's a hypothesis) does nothing at all to prove Creationism or that the Bible is literal truth.

      Mant

  • I made a few minor corrections, please refer to errata sheet 204.3445.

    -The Almighty.
