Psychology's Replication Battle
An anonymous reader sends this excerpt from Slate:
Psychologists are up in arms over, of all things, the editorial process that led to the recent publication of a special issue of the journal Social Psychology. This may seem like a classic case of ivory tower navel gazing, but its impact extends far beyond academia. ... Those who oppose funding for behavioral science make a fundamental mistake: they assume that valuable science is limited to the "hard sciences." Social science can be just as valuable, but it's difficult to demonstrate that an experiment is valuable when you can't even demonstrate that it's replicable. ... Given the stakes involved and its centrality to the scientific method, it may seem perplexing that replication is the exception rather than the rule. The reasons why are varied, but most come down to the perverse incentives driving research. Scientific journals typically view "positive" findings that announce a novel relationship or support a theoretical claim as more interesting than "negative" findings that say that things are unrelated or that a theory is not supported. The more surprising the positive finding, the better, even though surprising findings are statistically less likely to be accurate.
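The base-rate argument behind that last sentence can be sketched numerically. The alpha and power values below are conventional illustrative assumptions, not figures from the article: with a fixed false-positive rate, the probability that a statistically significant result reflects a real effect falls as the prior plausibility of the hypothesis falls.

```python
def ppv(prior, power=0.8, alpha=0.05):
    """Positive predictive value: P(effect is real | result is significant),
    by Bayes' rule. power and alpha are conventional assumed values."""
    true_positives = power * prior       # real effects correctly detected
    false_positives = alpha * (1 - prior)  # null effects flagged by chance
    return true_positives / (true_positives + false_positives)

# A plausible hypothesis (50% prior) vs. a surprising one (10% prior):
print(round(ppv(0.5), 3))  # 0.941
print(round(ppv(0.1), 3))  # 0.64
```

Under these assumptions, a "surprising" finding (low prior) that reaches significance is real only about 64% of the time, versus about 94% for a run-of-the-mill hypothesis, which is why replication matters most for exactly the findings journals favor.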
"less likely to be accurate" (Score:4, Funny)
That's a surprise.
Re:Easy to measure versus important (Score:4, Funny)
should both validate the idea
Over the years we've heard that a good Waterfall process was the magic bullet, with Data Flow Diagrams documenting everything before a line of code is written... No wait, it's Object Oriented Analysis/Design that will save the day... but no, that didn't work either — Service Oriented Architecture is the way to go. The latest fad is whatever book sold well recently; none of it is based on any metrics or real science.