Space Science

Milky Way Is Twice the Size We Thought

Peter writes to tell us about a research group at the University of Sydney in Australia who, in the middle of a calculation, wanted to check the number everybody uses for the thickness of our galaxy at its core. Using data freely available on the Internet and analyzing it in a spreadsheet, they discovered in a matter of hours that the Milky Way is 12,000 light years thick, versus the 6,000 that had been the consensus number for some time.
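
The press release doesn't spell out the team's method, but an astronomer further down the page notes that the measurement concerns the warm ionized gas layer, and the standard probe of that layer is the dispersion measure (DM) of pulsars. For an exponential electron layer n(z) = n0 exp(-z/H), the electron column perpendicular to the plane out to a pulsar at height z is DM sin|b| = n0 H (1 - exp(-z/H)), so fitting a few (z, DM sin|b|) pairs recovers the scale height H, and with it a "thickness" on the order of 2H. As a purely illustrative sketch of that kind of fit (not the Sydney group's actual code, and with placeholder numbers rather than real measurements), the whole exercise does indeed fit in a spreadsheet, or a few lines of C:

    #include <math.h>
    #include <stdio.h>

    /* Placeholder inputs, NOT real measurements: pulsar height above
       the plane z (pc) and perpendicular column DM*sin|b| (pc cm^-3). */
    static const double z_pc[] = { 200.0, 500.0, 1000.0, 2000.0, 4000.0 };
    static const double dmz[]  = { 4.5, 9.8, 15.8, 21.6, 24.5 };
    #define NPTS ((int)(sizeof z_pc / sizeof z_pc[0]))

    /* Electron column toward height z for n(z') = n0 * exp(-z'/H). */
    static double column(double n0, double H, double z)
    {
        return n0 * H * (1.0 - exp(-z / H));
    }

    int main(void)
    {
        double best_n0 = 0.0, best_H = 0.0, best_sse = HUGE_VAL;

        /* Crude grid search over midplane density n0 and scale height H;
           a real analysis would use a proper least-squares fitter. */
        for (double n0 = 0.005; n0 <= 0.050; n0 += 0.0005) {
            for (double H = 100.0; H <= 3000.0; H += 10.0) {
                double sse = 0.0;
                for (int i = 0; i < NPTS; i++) {
                    double r = column(n0, H, z_pc[i]) - dmz[i];
                    sse += r * r;
                }
                if (sse < best_sse) {
                    best_sse = sse;
                    best_n0  = n0;
                    best_H   = H;
                }
            }
        }

        printf("best fit: n0 = %.4f cm^-3, H = %.0f pc\n", best_n0, best_H);
        printf("implied thickness ~ 2H = %.0f pc (~%.0f light years)\n",
               2.0 * best_H, 2.0 * best_H * 3.26);
        return 0;
    }

Nothing exotic is going on here; the headline number is a single fitted parameter sitting directly on top of public data, which is exactly why it was so easy to recheck.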
  • A good reminder (Score:5, Interesting)

    by A beautiful mind ( 821714 ) on Wednesday February 20, 2008 @04:06AM (#22485314)
This is a good reminder that you're supposed to dig down to the raw data and validate it. I remember reading in one of Richard Feynman's books about a similar case: some conclusion appeared well supported because a lot of research papers endorsed the idea, but it turned out that they had all derived what they said from a single source.

The case here is similar; it's a good reminder that science is about data, validation, and facts, not about authority. You're supposed to check your data, check your facts, and try to avoid making implicit assumptions.
  • skeptical (Score:2, Interesting)

    by timmarhy ( 659436 ) on Wednesday February 20, 2008 @04:19AM (#22485396)
Have they checked the freely available sources they found on the internets? Seriously, I'm dubious of anything claiming to use a spreadsheet and/or internet sources these days.
  • Re:A good reminder (Score:3, Interesting)

    by Rocketship Underpant ( 804162 ) on Wednesday February 20, 2008 @04:25AM (#22485418)
That reminds me of a famous scientist who mentioned HIV in an article he was writing and wanted to cite the original source where it was first discovered and published that HIV causes AIDS. He couldn't find it. No one else he talked to could, either. It turns out that what is a common assumption (and perhaps true) had never actually been verified and published.
  • by I confirm I'm not a ( 720413 ) on Wednesday February 20, 2008 @04:35AM (#22485448) Journal

    To be fair to Wikipedia, they cite their source [wikipedia.org] for that claim. And the source is...

    ...(drumroll!)...

    NASA [nasa.gov]

  • by Anonymous Coward on Wednesday February 20, 2008 @04:45AM (#22485486)
Well, if we're expecting that the universe is actually 75 to 95 percent dark matter based on the SAME KIND OF FLAWED DATA, perhaps we are underestimating the amount of matter we actually CAN see.

    I always wondered how exactly they determined how much matter was in the universe, indirect evidence or not. It seems like there may be a few assumptive leaps there, upon which we build our entire cosmological understanding.

    If the 'missing' matter is actually regular matter that we haven't found, or have found and discounted, the search for dark matter will be even more in vain than it already appears to be. Can we stop looking?

  • file under pants (Score:4, Interesting)

    by tinkerton ( 199273 ) on Wednesday February 20, 2008 @04:46AM (#22485490)
That famous scientist may have allowed himself to get carried away a bit. What it means is that there was no clean breakthrough article; rather, evidence gradually accumulated. What it does not mean is that the connexion is only "perhaps true", certainly not at the current stage, where effective medicines exist.

On the other hand, it's good practice to have round-up articles that go over the evidence.
  • by pkphilip ( 6861 ) on Wednesday February 20, 2008 @05:20AM (#22485648)
What I find disturbing is the fact that a number is this widely off and no one discovered it for such a long time! I can imagine a deviation of x% or less, where x is a small number, but not a deviation of 100%.

The split of humans from the apes was pushed back another 6 to 7 million years earlier than previously thought, based on molecular genetics. The difference from the earlier estimate of around 5 to 6 million years is therefore over 100%:
    http://www.news24.com/News24/Technology/News/0,,2-13-1443_2169361,00.html [news24.com]
  • Actual paper? (Score:5, Interesting)

    by N7DR ( 536428 ) on Wednesday February 20, 2008 @06:09AM (#22485856) Homepage
Does anyone know where the actual paper can be found? TFA is just a news release for the popular press. The list of publications for the author of the study (http://www.physics.usyd.edu.au/~bmg/papers/) doesn't include anything that looks like the paper on which the news release is based.

TFA says: "The team's results were presented in January this year at the 211th meeting of the American Astronomical Society in Austin, Texas." But there's no indication of where the results have actually been published in a peer-reviewed journal so that one could read the paper for oneself. I looked on the AAS site and couldn't find anything there either. So, pending access to a detailed, published, peer-reviewed account of their work, I'm reserving judgement as to how valid the claim is.

  • by Anonymous Coward on Wednesday February 20, 2008 @06:43AM (#22486000)
Don't you think it is strange that the number was multiplied by exactly 2 (from 6,000 to 12,000)? How convenient! Why not a number such as 13,100 or 9,884? This number is just too clean to be true. Its only value may be to show that our last estimate was way off, but it doesn't tell us how far off this new estimate is...
  • by Jesus_666 ( 702802 ) on Wednesday February 20, 2008 @07:27AM (#22486268)

    Asking a serious question on slashdot. At night. Clearly.
So, if I obfuscate my question, I'm going to get better results?

    All right, here we go:
    Given a number of N chain reactions, H in which heated matter closely resembling our current model of isotopes of low-proton proton/neutron agglomerates and their corresponding companion electrons forms new such matter with the number of the protons per individual agglomerate increasing and being somewhere between one and 26 and under the condition that between individual such reactions there is a large amount of space and under the assumption that each such reaction is large enough to maintain itself for an amount of time significantly longer, most probably by orders of magnitude, than 375.7 fortnights and three days and under the assumption that each such reaction has an attractive force on everything else in the universe with a power diminishing with increasing distance and under the assumption that this attractive force causes the reactions to form clusters, L, which might be rotating or not and under the assumption that dense clusters of matter, K, circle each such reaction and under the assumption that
    $\exists c \in K: \exists q \in L: \exists ö \in H: ö \in q \wedge$ c circles ö $\wedge$ c is Earth,
how exactly is the magnitude of c's expansion defined in those dimensions of bosonic string theory that can be accurately described with the C datatype given in appendix 5?

    Appendix 1:
    Intentionally left blank.
    Appendix 2:
    Intentionally left blank.
    Appendix 3:
    Intentionally left blank.
    Appendix 4:
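    /* The 25 spatial dimensions of 26-dimensional bosonic string
       theory; the time dimension is omitted. */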
struct space_dimensions {
        double x;
        double y;
        double z;
        double d4;
        double d5;
        double d6;
        double d7;
        double d8;
        double d9;
        double d10;
        double d11;
        double d12;
        double d13;
        double d14;
        double d15;
        double d16;
        double d17;
        double d18;
        double d19;
        double d20;
        double d21;
        double d22;
        double d23;
        double d24;
        double d25;
    };

    Appendix 5:
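    /* The three spatial dimensions we can actually point at:
       height, width, depth. */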
struct sdim {
        long double h;
        long double w;
        long double d;
    };
  • by ta bu shi da yu ( 687699 ) * on Wednesday February 20, 2008 @08:26AM (#22486620) Homepage
I'd say it's insightful because (a) there are a lot of scientists interested in this sort of thing, and (b) the calculation has been around for quite some time with no one challenging it.

    I thought scientists were meant to challenge conventional wisdom? The parent poster is only saying that in his/her opinion it took far too long for this one to be tested again.
  • by boot_img ( 610085 ) on Wednesday February 20, 2008 @08:43AM (#22486720)
... is shown right here in this Slashdot discussion.

I am an astronomer, so first some background. The Milky Way has several components: young stars, old stars, dust, and various components of gas. They all have different thicknesses; there is no single "thickness". One of these components (the warm ionized gas) has been measured to have a thickness larger than expected. This measurement has not been confirmed by others, nor (I think) published yet.

Despite this complexity, this discussion thread is awash with arguments, confusion, wild speculation, suggestions that dark matter might be wrong, etc. OK, fine, this is Slashdot; that's what Slashdot is for.

    But the same people (presumably) have also rushed off to edit Wikipedia! (I see a half dozen edits this morning, to add in the "new" thickness.) That's the part that I find incredible. And people really take Wikipedia seriously?

  • by pkphilip ( 6861 ) on Wednesday February 20, 2008 @09:42AM (#22487118)
I agree that the nature of science is that we will continually need to improve on our findings and reach higher and higher levels of accuracy. That is to be expected.

What I find worrying is the size of the correction that needs to be applied, and also the fact that the correction took this long, especially considering that the group was able to arrive at a value *twice* the older one just by spending a little time studying the data.

    The questions it raises are:

1. How is it that the Milky Way was considered to be 6,000 light years thick? When someone made this claim, was the data ever rechecked by anyone? If someone with a spreadsheet can come up with this new value of 12,000 light years just by spending a few hours studying the data, why was it not done earlier? What happened to peer review? Was it ever conducted? If this isn't an indication of incompetence at some level among a few of the people involved in setting this value, what is?

2. Scientific findings will, no doubt, be modified as new things come to light. However, corrections are normally just a few percent off the initial value. A 100% change is not a refinement; it means the initial value was astoundingly, absolutely wrong. What is staggering is that the new value was not calculated from any *new* finding; rather, it was found just by recalculating from the *already* existing data.

3. What implications does this have for other findings?

My example about the dating of primate and human evolution was meant to show that these kinds of huge "corrections" have occurred in other scientific fields as well. So what we know to be absolutely true today can be completely off tomorrow.
  • by Anonymous Coward on Wednesday February 20, 2008 @02:07PM (#22490940)
Wow, you guys are awesome at browbeating people into perceived responsibilities.

Here are a few clues:

    1) This guy has zero responsibility to go be an article editor.
    2) If he chooses not to, that doesn't make the information less wrong.
    3) Fixing a problem after the fact is not the same thing as having no problem at all.
    4) Wasted effort makes the baby Spaghetti Monster cry.

    IOW, it's not OK to leave a shitty design or bugs in your app just because someone else can fix them for you. And if you don't know what you're talking about, keep your hands off the Wiki.
