
Nobel Winner Schekman Boycotts Journals For 'Branding Tyranny' 106
An anonymous reader writes "One of this year's Nobel laureates has declared a boycott on leading academic journals, accusing them of contributing to the 'disfigurement' of science. Randy Schekman, who won the Nobel Prize in physiology or medicine, said he would no longer contribute papers or research to the prestigious journals Nature, Cell and Science, and called on other scientists to fight the 'tyranny' of the publications." And if you'd rather not listen to the sound of auto-playing ads, you'll find Schekman's manifesto at The Guardian.
Re:crossing fingers. (Score:5, Interesting)
I'm not so sure that what he is complaining about is a big problem, because not many labs can keep chasing the fad topics. To keep your lab alive, you need to establish some kind of expertise. It is only after you've set up a self-sustaining lab that you can afford to repeatedly chase the hot topic du jour; in other words, by then you've most likely got tenure.
I can't say how familiar I am with the machinations of those particular journals, but I think most of the blame for the issues you mention lies with the colleges and universities that put so much emphasis on publication counts and impact factors.
An interesting aside, to me at least: I only recently installed Ghostery, and when I went to the article linked in the summary, it reported blocking 88 different tracking entities. Eighty-eight on one web page!
Re:crossing fingers. (Score:4, Interesting)
You can't really determine who the best researcher is without assessing the quality of the research itself. If you have 50 grant applications and 10 grants to award, how do you decide who gets them? Are you going to read through the entire research output of all 50 applicants in the one weekend you're given to determine who gets funded?
An adjusted citation index would probably be the best option, but that gets you back to the top journals, which are more likely to be read and cited than lower-impact journals, so you arrive back at comparing where one has published. Perhaps citation indexes should be adjusted to factor out the journal brand-name effect, but that will never happen, since it would penalize the current top researchers who hold the reins. And it's probably a stupid idea anyway.
Cronyism is the usual alternative to looking at where one has published, but that obviously has its own problems and is worse than simply looking at journal brand names. Although whether you get published in a great journal often depends on cronyism as well.
So all the realistic options are shitty.
Re:crossing fingers. (Score:4, Interesting)
I can't say how familiar I am with the machinations of those particular journals, but I think most of the blame for the issues you mention lies with the colleges and universities that put so much emphasis on publication counts and impact factors.
It's a symbiotic network of publications, promotions, and grant awards, and those journals are one of the core components of the network. They are not just passive beneficiaries of this system; they actively promote their role in it (by publicizing their impact factors, for instance). On top of that, these journals have made some major mistakes; I could add more examples to Schekman's list of bad publications. They are not being responsible powerholders, so it is urgent that we remove their power.
I think Schekman's point is to break the link between "high profile" work and those journals, so that universities cannot use publication in those journals as a proxy for the work being interesting.