Science

Dark Matter Filament Finally Found 190

Posted by samzenpus
from the there-it-is dept.
An anonymous reader writes "Everyone is talking about the recent Higgs boson announcement by the scientists at CERN, but another significant scientific discovery was revealed this week as well. In a study published online in the journal Nature on Wednesday, scientists show that they have successfully found the first dark matter filament."
  • by lgw (121541) on Thursday July 05, 2012 @09:05PM (#40559165) Journal

    Non-luminous normal matter absorbs light (and so becomes luminous normal matter eventually, at least at some frequency).

    BTW, the confirmation for dark matter vs. other theories of galaxy rotation came from the WMAP [wikipedia.org] data. IIRC, about 80% of the early matter of the universe was shown to be something that interacted gravitationally but did not interact with light (or electrons). The percentage of dark matter measured matched the amount predicted by the dark matter hypothesis for galaxy rotation rates, which is a pretty convincing confirmation IMO.
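    The arithmetic behind that "about 80%" can be sketched with round WMAP-era density parameters. The specific values below are illustrative assumptions, not numbers taken from the article or the comment:

    ```python
    # Round WMAP-era density parameters (illustrative assumptions):
    omega_dm = 0.23       # cold dark matter share of total energy density
    omega_baryon = 0.046  # ordinary (baryonic) matter share

    # Fraction of all matter that is dark:
    dark_fraction = omega_dm / (omega_dm + omega_baryon)
    print(f"{dark_fraction:.0%}")  # 83%, roughly the "about 80%" quoted above
    ```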

  • by Nom du Keyboard (633989) on Thursday July 05, 2012 @09:26PM (#40559273)

    I'd call this bigger than the Higgs.

  • by Areyoukiddingme (1289470) on Thursday July 05, 2012 @10:08PM (#40559463)

    It seems to me that conclusions based on lensing effects make a rather large assumption about the homogeneity of both interstellar and intergalactic space. The interstellar medium especially is supposed to contain a certain number of atoms per cubic meter, and the assumption is that those atoms are almost exclusively hydrogen and are evenly distributed.

    That may be an unwarranted assumption.

    We've been staring at the sky for a long time now, but only recently have we been doing very large sky surveys, and only very recently have we had the processing power available to do something useful with wide swaths of that data at once. It seems to me there might be some Ph.D.s to be had in using data from things like the Sloan Digital Sky Survey to validate assumptions about the interstellar medium in greater detail. There may be thin filaments (on interstellar scales) of finely distributed matter that are denser than the overall medium. Or less dense. Or clumps. Nebulas are typically very diffuse. Might it be possible for there to be nebula-like formations that are even more diffuse? So diffuse that they appear largely transparent to most frequencies? So diffuse that their only effect on light is lensing?

    I remember astronomers locating nebulas that were previously invisible because they don't emit visible light, but do emit in other parts of the spectrum. The explanation was they are older, cooler formations. But they don't just vanish as they continue to age. That gas is still around, getting ever cooler and more diffuse. Considering how much nova and supernova debris we've already identified in the galaxy, it doesn't seem too big of a leap to consider the long term (as in gigayears) ramifications of nova debris on the general interstellar medium.
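    For a sense of the scales the lensing argument above rests on, the textbook point-mass deflection formula alpha = 4GM / (c^2 b) can be sketched directly. Using the Sun as the lens is just a sanity check (it reproduces the classic Eddington-eclipse value); none of these inputs come from the article:

    ```python
    import math

    # Point-mass light deflection: alpha = 4GM / (c^2 b)
    # Inputs are standard solar values, used purely as a sanity check.
    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8        # speed of light, m/s
    M_sun = 1.989e30   # solar mass, kg
    b = 6.957e8        # impact parameter ~ solar radius, m

    alpha_rad = 4 * G * M_sun / (c**2 * b)
    alpha_arcsec = alpha_rad * (180 / math.pi) * 3600
    print(round(alpha_arcsec, 2))  # ~1.75 arcsec, the famous 1919 eclipse value
    ```

    A diffuse filament would bend light by vastly less per unit mass along any one sightline, which is why such measurements need statistical lensing over many background galaxies.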

  • by arth1 (260657) on Friday July 06, 2012 @03:02AM (#40560973) Homepage Journal

    Yes, I get a kick out of how that article, as well as the one on space.com linked to above, are both written under the assumption that we know "dark matter" exists... but we know no such thing. It is still a matter of much controversy (no pun intended).

    We have various theories to account for the observations, among them the most popular of the string theories, which support the existence of dark matter. On the other hand, a number of recent findings call string theory itself into serious question, perhaps even rendering it invalid.

    Much hinges upon whether the true God particle, the graviton, really exists. If it does, it would shake up the standard model. If it doesn't, it would shake up the standard model.
    Safest right now is to sometimes believe in it, treat its existence as being as unfalsifiable as God, and have a drink at the multi-dimension (including string theory) bar.

    In short, we are a tad short on understanding how mass and gravity really interact, and the implications - which is exactly what dark matter hinges on, both whether it exists and what it is.

  • by arth1 (260657) on Friday July 06, 2012 @03:14AM (#40561043) Homepage Journal

    Undoing 7 well-deserved mod points to post this, so pay attention. The Higgs was not a given. A particle in the same range without the ability to generate the Higgs field was also a possibility.

    Correct me if I'm wrong, but I think a pairing between a Z and W boson was also considered a candidate.

    And even assuming the Higgs boson, the question was whether it was in the 120-130 GeV or the ~182 GeV range - the energy difference could have significant impacts on the standard model, especially in higher-order Higgs interactions (when it interacts with itself), but also in how rare the particle would be, and in predicting where to find the last couple of missing particles (not counting the elusive graviton).
    All in all, the LHC result, although predicted, is a great discovery that will give physicists data they sorely need.

    Dark matter? Not so much. We know there are unobservable gravitational effects, but we can't currently say what they are, even if we can point to a place where they are. Nailing the Higgs boson may, in the future, help with this, but not yet.

  • by spike hay (534165) <blu_ice&violate,me,uk> on Friday July 06, 2012 @11:20AM (#40564459) Homepage

    Looking at "six sigma" is stupid if you are talking about the management fad, which assumes the data follow a normal distribution. Generally, frequentist statistics is misleading. It's not wrong, but it is very commonly used improperly. For example, if you hear that a null hypothesis that the mean of a distribution is less than zero (H0: mu < 0) is rejected at the 1% level, all that means is that if H0 were true, and the data followed the assumed distribution, 99% of the sample means you could get would be less than the one actually observed.

    This article actually uses Bayesian statistics (samples the posterior PDF using MCMC), rather than frequentist.
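    The frequentist point above can be illustrated with a minimal simulation: under a true null (mean exactly 0), a test cut at the empirical 99th percentile rejects about 1% of the time by construction. The sample size and trial count below are arbitrary choices for illustration, not anything from the article:

    ```python
    import random
    import statistics

    random.seed(42)
    n, trials = 30, 10_000

    # Simulate many sample means under a true null: population mean = 0.
    means = []
    for _ in range(trials):
        sample = [random.gauss(0, 1) for _ in range(n)]
        means.append(statistics.fmean(sample))

    # Empirical 99th percentile plays the role of the 1%-level critical value.
    means.sort()
    crit = means[int(0.99 * trials)]

    # Fraction of sample means exceeding it - the false-rejection rate.
    exceed = sum(m > crit for m in means) / trials
    print(round(exceed, 2))  # 0.01 by construction
    ```

    That is all a p-value tells you - how surprising the data would be if the null held - not the probability that the null is true, which is the question Bayesian posterior sampling actually addresses.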
