Milky Way Is Twice the Size We Thought
Peter writes to tell us about a research group at the University of Sydney in Australia which, in the middle of some calculation, wanted to check the numbers everybody uses for the thickness of our galaxy at the core. Using data freely available on the Internet and analyzing it in a spreadsheet, they discovered in a matter of hours that the Milky Way is 12,000 light years thick, versus the 6,000 that had been the consensus figure for some time.
A good reminder (Score:5, Interesting)
The case here is similar: it's a good reminder that science is about data, validation, and facts, not about authority. You're supposed to check your data, check your facts, and try to avoid making implicit assumptions.
Skeptical (Score:2, Interesting)
Re:A good reminder (Score:3, Interesting)
Re:Wikipedia says 1000 (Score:5, Interesting)
To be fair to Wikipedia, they cite their source [wikipedia.org] for that claim. And the source is...
...(drumroll!)...
NASA [nasa.gov]
Re:A good reminder - Disproval of dark matter? (Score:2, Interesting)
SAME KIND OF FLAWED DATA: perhaps we are underestimating the amount of matter we actually CAN see.
I've always wondered how exactly they determined how much matter is in the universe, indirect evidence or not.
It seems there may be a few assumptive leaps there, upon which we build our entire cosmological understanding.
If the 'missing' matter is actually regular matter that we haven't found, or have found and discounted,
the search for dark matter will be even more in vain than it already appears to be. Can we stop looking?
file under pants (Score:4, Interesting)
On the other hand, it's good practice to have roundup articles that go over the evidence.
Other instances of numbers widely off (Score:3, Interesting)
The split of humans from the apes has been pushed back another 6 to 7 million years earlier than previously thought, based on molecular genetics. The difference from the earlier estimate of around 5 to 6 million years is therefore over 100%.
http://www.news24.com/News24/Technology/News/0,,2-13-1443_2169361,00.html [news24.com]
Actual paper? (Score:5, Interesting)
TFA says: "The team's results were presented in January this year at the 211th meeting of the American Astronomical Society in Austin, Texas." But there's no indication of where the results have actually been published in a peer-reviewed journal so that one could read the paper for oneself. I looked on the AAS site and couldn't find anything there either. So, pending access to a detailed, published, peer-reviewed account of their work, I'm reserving judgement as to how valid the claim is.
it is not 12000 either! (Score:1, Interesting)
Why not a number such as 13,100 or 9,884? This number is just too clean to be true. Its only value may be to show that our last estimate was way off, but it doesn't tell us how far off this new estimate is...
Re:Is this real information? (Score:2, Interesting)
All right, here we go:
Given a number of N chain reactions, H in which heated matter closely resembling our current model of isotopes of low-proton proton/neutron agglomerates and their corresponding companion electrons forms new such matter with the number of the protons per individual agglomerate increasing and being somewhere between one and 26 and under the condition that between individual such reactions there is a large amount of space and under the assumption that each such reaction is large enough to maintain itself for an amount of time significantly longer, most probably by orders of magnitude, than 375.7 fortnights and three days and under the assumption that each such reaction has an attractive force on everything else in the universe with a power diminishing with increasing distance and under the assumption that this attractive force causes the reactions to form clusters, L, which might be rotating or not and under the assumption that dense clusters of matter, K, circle each such reaction and under the assumption that
$\exists c \in K: \exists q \in L: \exists ö \in H: ö \in q \wedge$ c circles ö $\wedge$ c is Earth,
how exactly is the magnitude of c's expansion defined in those dimensions of bosonic string theory that can be accurately described with the C datatype given in appendix 5?
Appendix 1:
Intentionally left blank.
Appendix 2:
Intentionally left blank.
Appendix 3:
Intentionally left blank.
Appendix 4:
struct space_dimensions {
    double x;
    double y;
    double z;
    double d4;
    double d5;
    double d6;
    double d7;
    double d8;
    double d9;
    double d10;
    double d11;
    double d12;
    double d13;
    double d14;
    double d15;
    double d16;
    double d17;
    double d18;
    double d19;
    double d20;
    double d21;
    double d22;
    double d23;
    double d24;
    double d25;
};
Appendix 5:
struct sdim{long double h;long double w;long double d;};
Re:Other instances of numbers widely off (Score:4, Interesting)
I thought scientists were meant to challenge conventional wisdom? The parent poster is only saying that in his/her opinion it took far too long for this one to be tested again.
The problem with Wikipedia (Score:5, Interesting)
I am an astronomer, so first some background: the Milky Way has several components: young stars, old stars, dust, and various components of gas. They all have different thicknesses. There is no single "thickness". One of these components (warm ionized gas) has been measured to have a thickness larger than expected. This measurement has not been confirmed by others, nor (I think) published yet.
Despite this complexity, this discussion thread is awash with arguments, confusion, wild speculation, suggestions that dark matter might be wrong etc. etc. OK, fine, this is slashdot, that's what slashdot is for.
But the same people (presumably) have also rushed off to edit Wikipedia! (I see a half dozen edits this morning, to add in the "new" thickness.) That's the part that I find incredible. And people really take Wikipedia seriously?
Re:Other instances of numbers widely off (Score:5, Interesting)
What I find worrying is the size of the correction that needs to be applied, and also the fact that the correction took this long, especially considering that the group was able to arrive at a value *twice* the older one just by spending a little time studying the data.
The questions it raises are:
1. How is it that the Milky Way was considered to be 6,000 light years thick? When someone made this claim, was the data ever rechecked by anyone? If someone with a spreadsheet can come up with this new value of 12,000 light years in a few hours of studying the data, why was it not done earlier? What happened to peer review? Was it ever conducted? If this isn't an indication of incompetence at some level among the few people involved in setting this value, what is?
2. Scientific findings will, no doubt, be revised as new things come to light. However, corrections are normally just a few percent off the initial value. A 100% change is not a refinement; it means the initial value was astoundingly and absolutely wrong. What is staggering about this is that the new value was not calculated from any *new* finding, but was found just by recalculating from the *already* existing data.
3. What implications does this have on other findings?
My example about the dating of primate and human evolution was to show that these types of huge "corrections" have occurred in other scientific fields as well. So what we know to be absolutely true today can be completely off tomorrow.
Re:The problem with Wikipedia (Score:1, Interesting)
Here are a few clues:
1) This guy has zero responsibility to go be an article editor.
2) If he chooses not to, that doesn't make the information less wrong.
3) Fixing a problem after the fact is not the same thing as having no problem at all.
4) Wasted effort makes the baby Spaghetti Monster cry.
IOW, it's not OK to leave a shitty design or bugs in your app just because someone else can fix them for you. And if you don't know what you're talking about, keep your hands off the Wiki.