The Journal of Serendipitous and Unexpected Results
SilverTooth writes "Often, when watching a science documentary or reading an article, it seems that the scientists were executing a well-laid-out plan that led to their discovery. Anyone familiar with the process of scientific discovery realizes that is a far cry from reality. Scientific discovery is fraught with false starts and blind alleys. As a result, labs accumulate vast amounts of valuable knowledge on what not to do, and what does not work. Trouble is, this knowledge is not shared using the usual method of scientific communication: the peer-reviewed article. It remains within the lab, or is at most shared informally among close colleagues. As it stands, the scientific culture discourages sharing negative results. Byte Size Biology reports on a forthcoming journal whose aim is to change this: the Journal of Serendipitous and Unexpected Results. Hopefully, scientists will be able to better share and learn more from each other's experience and mistakes."
A great idea (Score:5, Interesting)
Re:A great idea (Score:5, Interesting)
Re:A great idea (Score:4, Interesting)
Re:A great idea (Score:5, Interesting)
Repeating the study on a different population and failing to find a significant result can also show that the results don't generalise to that population.
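The point above can be illustrated with a minimal sketch: run the same two-sample comparison on an original population (where an effect exists) and on a new population (where it doesn't), and see the test statistic collapse. All data here is simulated and the populations, sample sizes, and effect size are hypothetical; a Welch's t statistic is computed by hand using only the standard library.

```python
# Sketch: a replication on a different population failing to find a
# significant effect. All samples are simulated; the effect size (1.0)
# and sample size (50) are illustrative assumptions, not from any study.
import math
import random
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic and approximate degrees of freedom
    for two independent samples with unequal variances."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)
    se2 = va / na + vb / nb
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

random.seed(0)
# Original population: treatment really shifts the mean by 1.0 sd.
treat_orig = [random.gauss(1.0, 1.0) for _ in range(50)]
ctrl_orig = [random.gauss(0.0, 1.0) for _ in range(50)]
# New population: the treatment has no effect at all.
treat_new = [random.gauss(0.0, 1.0) for _ in range(50)]
ctrl_new = [random.gauss(0.0, 1.0) for _ in range(50)]

t_orig, _ = welch_t(treat_orig, ctrl_orig)
t_repl, _ = welch_t(treat_new, ctrl_new)
print(f"original study:  t = {t_orig:.2f}")  # expected to be large here
print(f"replication:     t = {t_repl:.2f}")  # expected to be near zero
```

Publishing the second, "failed" comparison is exactly the kind of negative result that tells the field where an effect's boundaries lie.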
Re:A great idea (Score:5, Interesting)
I agree (my first paper was a negative-results paper), but I think there are some kinds of negative results that are relatively hard to get published. Papers along the lines of: "here's an approach you might have thought would work, but it turned out that it didn't, and in retrospect we can see why, which this paper will explain". If you try to submit a paper like that, you often get push-back of, "oh well, yeah it's obvious why that wouldn't work, dunno why you didn't see it earlier". And of course it often is obvious once you've read why it doesn't work.
As you point out, it's quite a bit easier to get negative results published if someone else had already claimed them as positive results. In that case, you're not both proposing and shooting down the idea simultaneously, but shooting down (or failing to confirm) someone else's idea, which has the advantages that: 1) you have evidence that at least one presumably smart person really didn't think it was obviously a bad idea (in fact, they thought it was a good one, and even that it worked); and 2) you're positioned as correcting an error in the literature, rather than as introducing a correction for a hypothetical error nobody has yet made.
It's a bit tricky to fix, because some negative results really are obvious: it does nobody in the field any good to publish "we tried X on Y, and it didn't work", if genuinely nobody who was competent in the field would've thought X would work on Y, and the reason was exactly the reason you discovered.
Incidentally, here's [uottawa.ca] one previous attempt to start such a journal that didn't really get off the ground. Their one published article, which is quite good, is of the form I mention: the authors of a system called Swordfish recounted an idea they had to produce an improvement, Swordfish2, that in the end turned out to be no better than the original Swordfish. It was hard to get published elsewhere, because it wasn't correcting an existing result (nobody had previously proposed that the change they tried in Swordfish2 was actually a good idea), but it's interesting (to me, at least) because it really does seem like a plausible idea, and I feel I learned something in reading why it didn't work.
Re:Fantastic idea (Score:3, Interesting)
using statistical analysis developed by economists
Funny, given recent events I would be more worried about the economists' models.
Re:A great idea (Score:4, Interesting)
In my experience it has nothing to do with egos or competition.
But it is damn hard to publish something that doesn't work!
I was recently involved in developing a microfluidic system for diagnostics. Every milestone and sub-problem was solved. But when the final injection-molded devices were tested, they failed due to a rather interesting, non-obvious combination of factors. There were two issues with publishing this: the problems were very specific to our system, and the conclusion could be written in 5 lines of text.
It would have been like a movie with a huge setup, but within the first 3 minutes the hero stumbles, breaks his neck, and dies. End credits. It was an EU-funded research project: no more money, no more time. You can't get funding to continue a failed project. End of story.
But my point is, in all my experience as a scientist, I've never seen one of my colleagues say "we should hide this", but I've often heard "I would like to tell people about this, but I don't know of a paper that would accept it".
Also, when something fails we need to carry on, but now we're behind schedule...
Re:A great idea (Score:3, Interesting)
Is that such a good idea? (Score:3, Interesting)
I can't help but also wonder if this is a good use of peer reviewing, which is already in short supply, or so I've heard.
conferences and informal communication help (Score:3, Interesting)
In addition, there is often a lot of benefit in working things out for yourself - this provides the in-depth understanding needed to base deeper work on, which can be lacking if you are merely following instructions.