Nobel Winner Schekman Boycotts Journals For 'Branding Tyranny'
An anonymous reader writes "One of this year's winners of the Nobel Peace prize has declared a boycott on leading academic journals after he accused them of contributing to the 'disfigurement' of science. Randy Schekman, who won the Nobel Prize in physiology or medicine, said he would no longer contribute papers or research to the prestigious journals, Nature, Cell and Science, and called for other scientists to fight the 'tyranny' of the publications." And if you'd rather not listen to the sound of auto-playing ads, you'll find Schekman's manifesto at The Guardian.
crossing fingers. (Score:5, Insightful)
Re: (Score:2)
This is what happens when Academia becomes beholden to commercial interests.
Shenkman (Score:2)
I preferred him in the Ghostbusters movies.
TFA's beef is with journal "prestige" (Score:5, Insightful)
IAAPGS
FWIW, while Cell and Nature are both owned by private companies, Science is run by a non-profit (the American Association for the Advancement of Science), and articles in Science are made freely available two years after publication.
Having read his manifesto, I don't think his issue is with corporate publishers per se. His issue is with the culture of judging the quality of work by the prestige of the journal it was published in. That allows journals to further exploit the process; they have a large incentive to publish flashy research rather than quality research, because flashy research gets more citations -- thus making the journal more prestigious.
While I agree this is a flawed system, I'm not convinced that open-access journals are the solution; there are already prestigious open-access journals -- like Physical Review X and the New Journal of Physics (both run by non-profits that also publish prestigious, closed-access journals).
To some extent, you need both flash and quality research. I'm sure someone could do quality research on the physics of navel lint trapping, but pretty much no one would care; the research isn't interesting, and it wouldn't be worth the effort to peer review. So, for better or worse, I don't think the flashy factor will or should totally go away, although I agree it should be reduced.
That said, I am a fan of open-access journals, but I need something to publish first. I guess I should get back to research and stop wasting time with Slashdot posts....
Re: (Score:1)
IAAPGS
FWIW, while Cell and Nature are both owned by private companies, Science is run by a non-profit (the American Association for the Advancement of Science), and articles in Science are made freely available two years after publication.
This free access requires registration and only includes content from 1997 forward.
While something is better than nothing, locking down the first 117 years of content unless you pay for the backfile isn't exactly researcher-friendly.
I wish more publishers would follow the lead of PNAS, the American Heart Association, and the American Society for Microbiology:
All backfile content open and relatively short embargoes on new content (6-12 months).
Re: (Score:3, Informative)
Commercial interests have nothing to do with this (at least, they are far removed).
Most biology research is funded by the federal government, and grant funding rates have gotten very low (meaning that it is very competitive and reviewers are looking for shortcuts).
Likewise, the big research universities (the most prestigious jobs) are non-profit, or even state run... and they evaluate their faculty in large part on their ability to get grant funding.
Re:crossing fingers. (Score:5, Interesting)
I'm not so sure what he is complaining about is a big problem, because not too many places can keep chasing the fad topics. To keep your lab alive, you need to establish some kind of expertise. It is only after you've set up a self-sustaining lab that you can afford to repeatedly chase after the hot topic du jour. In other words, you've most likely got your tenure.
I can't say how familiar I am with the machinations of those particular journals, but I think most of the blame for the things that cause the issues you mention lie with the colleges and universities who put so much emphasis on publication counts and impact factors.
An interesting aside, to me at least, is that I only recently installed Ghostery and when I went to the article linked in the summary, I was notified of 88 different tracking entities that were blocked. Eighty-eight on one web page!
Re:crossing fingers. (Score:5, Insightful)
Re:crossing fingers. (Score:5, Funny)
Re: (Score:2)
Re: (Score:1)
What field are you talking about? In Schekman's field (basic biomedical research), startup packages tend to carry you for a few years, and the core funding mechanism is a 5-year grant (the NIH R01).
Re: (Score:2)
Re: (Score:2)
Or the results are not what someone wanted to see, so they can the project.
Re:crossing fingers. (Score:4, Interesting)
I can't say how familiar I am with the machinations of those particular journals, but I think most of the blame for the things that cause the issues you mention lie with the colleges and universities who put so much emphasis on publication counts and impact factors.
It's a symbiotic network of publications, promotions, and grant awards -- and those journals are one of the core components of the network. Those journals are not just passive beneficiaries of this system; they actively promote their role in it (by publicizing their impact factor, for instance). On top of that, these journals have made some major mistakes. I could add more examples to Schekman's list of bad publications. They are not being responsible power holders, so it is urgent that we remove their power.
I think Schekman's point is to break the link between "high profile" work and those journals, so that universities cannot use publication in those journals as a proxy for work being interesting.
Re: (Score:1)
Part of the point of tenure is to be secure enough not to have to chase after the topic du jour, but to be free to pursue other topics instead.
Re: (Score:2)
It's pretty simple. Publish online.
Re:crossing fingers. (Score:4, Insightful)
Re:crossing fingers. (Score:5, Insightful)
What random unidentified bloggers think about publishing has no bearing whatsoever on what scientists think about scientific publishing. Publishing online does not necessitate that peer review be dispensed with. I've not ever met an academic, be it in the sciences or elsewhere, who ever argued that print peer-reviewed publications should be replaced by online publications that are not peer reviewed.
You're attacking a strawman.
Re: (Score:3)
I've not ever met an academic, be it in the sciences or elsewhere, who ever argued that print peer-reviewed publications should be replaced by online publications that are not peer reviewed.
Strictly speaking, this is correct, but there certainly are serious scientists arguing for peer review taking place after publication, not before - under this scheme, we would simply post our raw manuscripts online (e.g. to arXiv or a similar server). But the ultimate goal is to have more peer review, not less, with the part
Re: (Score:2)
That sort of system could work quite well, but you'd need some commonly accepted way of knowing when a paper had been peer reviewed by some reasonable number of reviewers. arXiv or similar could certainly provide that service easily enough, as well as some way to know that the reviewers were in fact reasonable choices themselves, but it all runs the risk of becoming dependent on a new central authority.
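For what it's worth, here's a minimal sketch of how such a service might track post-publication review status -- the names, fields, and the three-review threshold are all invented for illustration, not anything arXiv actually does:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a preprint counts as "peer reviewed" once it has
# endorsements from a minimum number of reviewers whom the server itself
# recognizes (e.g. people with their own vetted publications).

MIN_VETTED_REVIEWS = 3  # arbitrary threshold for this sketch

@dataclass
class Review:
    reviewer_id: str
    reviewer_is_vetted: bool  # e.g. has refereed publications on record
    verdict: str              # "sound", "flawed", ...

@dataclass
class Preprint:
    preprint_id: str
    reviews: list = field(default_factory=list)

    def is_peer_reviewed(self):
        vetted = [r for r in self.reviews if r.reviewer_is_vetted]
        return len(vetted) >= MIN_VETTED_REVIEWS

paper = Preprint("1312.00001")
paper.reviews.append(Review("alice", True, "sound"))
paper.reviews.append(Review("bob", True, "sound"))
paper.reviews.append(Review("carol", False, "sound"))  # not vetted, not counted
print(paper.is_peer_reviewed())  # False until a third vetted review arrives
```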
Re:crossing fingers. (Score:5, Informative)
While journals are not perfect, they do (usually) maintain some minimum bars and filters for the material that goes into them.
"Journals published by Elsevier, Wolters Kluwer, and Sage all accepted my bogus paper." [retractionwatch.com]
Re:crossing fingers. (Score:4, Insightful)
The problem is that Schekman's argument is off base.
From the article (yes, I read it):
"These luxury journals are supposed to be the epitome of quality, publishing only the best research. Because funding and appointment panels often use place of publication as a proxy for quality of science, appearing in these titles often leads to grants and professorships."
His argument appears to revolve around these three high-impact journals serving as the gate keepers of "good" science. But his ire is misdirected. If funding and appointment panels are giving undue weight to publications in these journals, then THE PROBLEM LIES WITH THE FUNDING AND APPOINTMENT PANELS, not the journals.
His argument is tantamount to "Scientists shouldn't publish in these journals because they're too highly regarded."
Re: (Score:2)
His argument is tantamount to "Scientists shouldn't publish in these journals because they're too highly regarded."
Perhaps those trees are overwatered, and some others could use a little love.
Re:crossing fingers. (Score:4, Interesting)
You can't really determine who the best researcher is by assessing the quality of the research directly. If you have 50 grant applications and 10 grants to award, how do you decide whom to give them to? Are you going to read through the entire body of work of all 50 applicants, in the one weekend you're given, to determine who gets funded?
An adjusted citation index would probably be the best option, but that gets back to the top journals, which are more likely to be read and cited than lower-impact journals, so you arrive back at comparing where one has published. Perhaps citation indexes should be adjusted to factor out the journal brand-name effect, but that won't ever happen, since it would penalize the current top researchers who hold the reins. And it's probably a stupid idea anyway.
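Just to make the "factor out the journal brand-name effect" idea concrete, here's a toy sketch: scale each paper's citations by the average citation count of its journal, so a paper is judged relative to its venue rather than boosted by it. Every number and journal name below is made up for illustration.

```python
# Toy "journal-adjusted" citation score: divide a paper's citations by the
# mean citation count of its journal, then average over an author's papers.
# All figures below are invented for illustration.

journal_mean_citations = {"Nature": 40.0, "J. Niche Results": 4.0}

author_papers = [
    {"journal": "Nature", "citations": 60},            # 1.5x its venue's average
    {"journal": "J. Niche Results", "citations": 12},  # 3.0x its venue's average
]

def adjusted_index(papers, journal_means):
    scores = [p["citations"] / journal_means[p["journal"]] for p in papers]
    return sum(scores) / len(scores)

# The "lesser" venue paper actually outperformed its peers by more.
print(round(adjusted_index(author_papers, journal_mean_citations), 2))  # 2.25
```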
Cronyism is the preferred alternative to looking at where one has published, but obviously that has its problems and is worse than simply looking at the journal brand name. Although whether you get published in a great journal often depends on cronyism as well.
So all the realistic options are shitty.
Journals are a symptom, not a cause (Score:5, Insightful)
Either make tenure easier to get so that professors are less likely to pursue fad or headline-grabbing science in order to achieve it, or encourage more grants to scientists that aren't affiliated with particular schools, so that they don't have to dance for their boards...
Unfortunately most major companies aren't conducting basic research like IBM, Xerox, Bell, and other big organizations did fifty+ years ago, so getting grants from big entities is harder than it once was.
Re:Journals are a symptom, not a cause (Score:5, Insightful)
According to Schekman's argument, journals --- specifically the highest-impact-factor "luxury" journals --- do play a causal rather than merely symptomatic role in the process. Such journals court papers that are "flashy," which will get lots of citations and attention (thus lots of journal subscriptions), possibly because they are wrong and focused more on attention-getting controversial claims than scientific rigor. This provides feedback on the other side of the "publish or perish" culture, shaping what sort of articles tenure-seeking professors are pressured to churn out. If a scientist wants to establish their reputation by publishing ground-breaking, exciting discoveries, there's nothing a priori wrong with that; the failure comes when that is joined with impact-factor-seeking journals applying distorted, lower standards of scientific rigor to "attention-getting" work (while rejecting solid but "boring" research papers).
Re: (Score:2)
So which prize did he win? (Score:5, Informative)
Re: (Score:3, Funny)
The moon in your eye,
like a big Peace-a Prize.
No, I guess not.
Re: (Score:2)
It's time to start a boycott of Slashdot. These summaries are getting too bad to ignore. Weeklong hours? Peace prize in physiology or medicine?
I might have to resort to doing work to pass the day.
Re: (Score:1)
You could just work on installing a spell checker. That might keep you off the streets for a while....
Don't do anything you would regret for the rest of your life.
Re: (Score:1)
Maybe it's part of the branding.
Francis Crick once remarked that people who met him for the first time sometimes said, "Oh, I thought your name was Watson-Crick".
Contradiction (Score:5, Informative)
I don't know why I need to point this out, but the Nobel Peace Prize and the Nobel Prize for Physiology or Medicine are not the same thing. Schekman has only won the latter, not the former.
Re: (Score:2)
"Only" here limiting the number of Nobel prizes won, not modifying the perceived prestige of the prize.
Re: (Score:2)
Neither I nor the article's summary imply anything about simultaneity.
Re: (Score:2)
Or you lack the ability to read.
Re: (Score:2)
"One of this year's winners of the Nobel Peace prize"
Taking into account that only about 1/4 of peace prizes are shared and that this year's was not, that sentence makes the post wrong from the first one or two words.
I just wanted to point out that this might be a record. A feat deserving 2013's "Fastest Error Award", also called Nobel Peace Price by some.
Re: (Score:2)
Re: (Score:1)
No.
Seek help.
Re:so... (Score:5, Informative)
He does keep contributing --- to online, open-access journals without the perverse incentives of the "luxury brand" publishers. This way, alternative journals get to build the reputation of attracting top scientists and publishing good-enough-for-a-Nobel-prize-winner research, which can help change the perceptions that make publication in the "luxury brand" journals necessary for scientific careers.
Re: (Score:2)
I'm going to do my part. I promise never to send a manuscript to Nature or Science.
What about you?
Re: (Score:2)
I'm way ahead of you --- I haven't submitted any manuscripts to Nature or Science ever --- I'm decades ahead on this boycott thing! ;)
But, yes, I'll try to take ethical considerations like this into consideration when publishing (so far as I have the ability to convince other co-authors). I also happen to tend more towards "precision measurement" experiments than "big flashy discovery," so my research is usually safely on the "boring but, I hope, solid" side anyways (that wouldn't be aimed at a "luxury bran
Re: (Score:1)
So, a Nobel prize winner declares, one day before accepting the prize, that we should all boycott the journals that got his research the recognition required to win the Nobel. Instead, he suggests people contribute to open-access journals, like eLife, for which he is editor-in-chief.
He may have some valid points, and god knows there is something very wrong with academic publishing today, but this could not sound more cynical. Seriously: "these journals were great for my career up to today, but now you should
Re: (Score:2)
Re:so... (Score:5, Informative)
a way to fix this, by his own terms, is to stop contributing ... bravo??? Shouldn't he contribute more instead? That would be better than the "fuck it, I quit" attitude
Schekman is the editor-in-chief of eLife [elifesciences.org], a new open-access biomedical journal (so it's a bit personal for him - not that I disagree with his message). Previously he was the editor of PNAS, one of the better publications by non-profit publishers.
Good on him (Score:3)
Re:Good on him -- lets the rest of us have a shot (Score:1)
It's not an issue of early-career vs established scientists -- it's an issue of pedigreed vs. self-made scientists.
Schekman is saying that he won't support a student's desire to submit a paper to SNC. His students will still have the benefit of being associated with a Nobel Prize winner. I see this as a sort of unilateral disarmament from someone whose influence is assured. Schekman and his people have already been noticed, so he's letting everyone else have a chance at getting noticed too (by publishing in
Same goes for wider publishing (Score:1)
Fed up with publication pressure (Score:5, Insightful)
Not only are many (most?) academics fed up with the big journals, we are also generally fed up with publication pressure. Our school is just now going through a review. The accreditation people want numbers of publications. It doesn't matter what you wrote about, or whether you had anything useful to say; it's just numbers.
Who read about the University of Edinburgh physicist? He just won the Nobel prize and has published a total of 10 papers in his entire career. As he said, today he wouldn't even get a job.
I understand that school administrations want some way to measure faculty performance. But just as student reviews are a dumb way to assess teaching quality (because demanding teachers may be rated as poorly as incompetent teachers), number of publications is a dumb way to assess research quality.
Re:Fed up with publication pressure (Score:4, Informative)
Who read about the University of Edinburgh physicist? He just won the Nobel prize and has published a total of 10 papers in his entire career. As he said, today he wouldn't even get a job.
You mean Peter Higgs?
Re: (Score:3)
Names aren't important. What is important is his current academic affiliation.
Re: (Score:2)
Re: (Score:2)
Both are important. Believe me, the name matters a lot.
I think you missed the GP's sarcasm. It was rather dry.
Re: (Score:2)
Re: (Score:3)
Publish or perish must go (Score:5, Insightful)
"Publish or perish" is a unique pressure on mid-career academics to churn out publications. It is an administrative metric that, when applied, can lead to career-ending outcomes for academics deemed "unproductive." This highly arbitrary metric looks at the number of papers published and sometimes the journal impact factor, but it fails to measure scientific contribution to the field. Application of this metric is linked to all kinds of scientific misconduct -- from correlation fishing expeditions, to questionable practices in formulating research questions, to outright "data cooking" and fraud.
Re:Publish or perish must go (Score:5, Insightful)
Re: (Score:3)
Properly evaluating other people's work is very hard. The comparative evaluation of that work against other people's work is even harder. But in an optimal system, such evaluation is essential. This is one of the fundamental problems of leadership--and universities suck at it.
Publication isn't even the most important category of work output--teaching quality is. But teaching gets shunted aside, because nobody is really taking the time to carefully evaluate the quality of the teaching. Prospective students
Publish or perish must go (Score:4, Insightful)
Re: (Score:1)
On the ground, hiring decisions are made by an academic department's faculty (with oversight from deans, etc. who are themselves researchers)
Re: (Score:3)
The way it should be is that the metrics for performance are the aggregate quality and impact of the work, not the number of publications or the impact factors of the journals they go into. Why doesn't this work? Because administrators generally don't understand the science that they are "administering."
That isn't why it doesn't work. It doesn't work because there's no particularly good objective metric for "quality" or "impact." For "impact" you have number of citations and where the work was published. If you want to get fancy, you can make a metric that takes into account what those citations were. Quality is pretty much impossible to judge objectively, particularly if you want to compare across fields. It doesn't matter how competent or knowledgeable your assessors are--that's not the limiting factor.
Re:Publish or perish must go (Score:4, Informative)
Re:Publish or perish must go (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Duh, it's not so hard. The scientists could actually bother to replicate more than a tiny sliver of all results published, and citations of papers not replicated could be treated as damning with faint praise.
One thing peer review cannot catch is chance aberration in the experimental data (structural aberration is a different matter).
Without actually replicating the significant results, it all d
Re: (Score:2)
Duh, it's not so hard. The scientists could actually bother to replicate more than a tiny sliver of all results published, and citations of papers not replicated could be treated as damning with faint praise.
Firstly, you're addressing a different problem--repeatability in science--not how to evaluate the performance of an individual. Saying, for example, that a researcher can't get tenure until their results are replicated is too high a bar. Furthermore, what credit do the people doing the direct replication experiments get? Not a whole lot. Nor can you wait for the field as a whole to validate a researcher's findings; that can take decades. Science often works on a longer time scale than one person's
Re: (Score:2)
The answer is the same as it always is. Don't measure what's easy - measure for the results you want to see. Do you want the janitor to check the restaurant bathroom every 15 minutes, or do you want the bathroom to be spotless after the janitor leaves every 15 minutes? One is easy; just have him sign a sheet in the bathroom. The other requires actual work, spot checks after the janitor leaves the bathroom for instance. Similarly, a factory worker may be easy to assess. Throughput without errors or inj
bad news for his students and postdocs (Score:1)
From TFA:
"I have now committed my lab to avoiding luxury journals"
This is bad news for his students and postdocs who wish to get a job in the future. Publishing in a 'luxury' journal is almost a requirement for getting a permanent position in scientific research. However much I may agree with his point of view, he also has a responsibility to advance the careers of the promising students and postdocs in his group.
Slashdot wins Nobel (Score:5, Funny)
Re: (Score:1)
Nobel editorial prize.
You mean the Nobel Peace Editorial Prize?
Already doing this in Physics (Score:4, Informative)
Re: (Score:2)
Grandparent is right (Score:2)
I don't have statistics on this, but resubmitting after peer review is the standard way of doing things in my field (cosmology). That doesn't mean submitting the version that appears in the actual journal, with its formatting etc., but the version that passed the peer review, with all the reviewers' comments addressed.
As supporting evidence, here is the license of one of the most heavily used pieces of software in my field, camb:
You are licensed to use this software free of charge on condition that:
Any publication using results of the code must be submitted to arXiv at the same time as, or before, submitting to a journal. arXiv must be updated with a version equivalent to that accepted by the journal on journal acceptance.
If you identify any bugs you report them as soon as confirmed
Journals would not be in a position to try to fight this - nobody reads the jour
Re: (Score:2)
I don't understand why this hasn't been taken up by other fields by now.
Speaking as a member of another field: a mixture of disorganisation and Stockholm syndrome.
Nobel Peace Prize in Medicine and Physiology? (Score:5, Funny)
Shouldn't that be "Peace Prize in Economic Sciences in Memory of Alfred Nobel for Medicine and Physiology"?
Publishing in flashy journals is killing quality (Score:2)
In my field (electrochemistry), the last 5-10 years have seen a great many researchers move away from the "traditional" journals (Journal of the Electrochemical Society, Solid State Letters, Electrochimica Acta) to the flashier, more general publications (ACS and RSC publications, mostly). These journals are more widely read, so their impact factor is much higher. But most of their content is also irrelevant, and since the public reading them is not really expert in my field, what is important is t
Flip the tables: have journals bid for papers (Score:3)
Articles are submitted anonymously to a central site. Perhaps rough statistics on the author's past work can be included, but nothing more. Each paper sits there for a fixed period, maybe 3 or 4 weeks. Editors scour the site and bid for the papers they want to put through peer review at their journal. The community can assign ratings (1 to 10 stars in 2 or 3 different categories) to papers to help guide editors. At the end of the 3 or 4 weeks, the authors choose which of the journals that bid should get their submission.

The journal sends the paper to reviewers. Reviewers know which journal sent them the paper but obviously don't know the author names. Reviewers aren't allowed to reject a paper for not being novel (the journal already made that value judgement); they can only make objective scientific critiques. If it fails to get in, the authors can send their paper and (optionally) the reviews to the next journal on their list. That journal is not allowed to ask for new reviewers if the authors have already supplied reviews and addressed the criticisms -- adding too many reviewers invariably results in unrealistic demands on authors. The final anonymous reviews are made available as supplemental info after publication; this may decrease the incidence of shitty, biased reviews. (A rough sketch of this flow follows below.)
So this is somewhat like arXiv, but papers not accepted get pulled down (they can be resubmitted) and it's intended to be a gateway to publication.
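Here's a minimal sketch of what the bidding flow above might look like in code -- the journal names, the paper ID, the 4-week window, and the rating scale are all invented for illustration, not part of any existing system:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

BIDDING_WINDOW = timedelta(weeks=4)  # the "3 or 4 weeks" above, fixed arbitrarily

@dataclass
class Bid:
    journal: str
    placed_on: date

@dataclass
class Submission:
    paper_id: str                        # authors stay anonymous to bidders
    posted_on: date
    bids: list = field(default_factory=list)
    community_ratings: list = field(default_factory=list)  # 1-10 stars

    def window_open(self, today):
        return today <= self.posted_on + BIDDING_WINDOW

    def add_bid(self, journal, today):
        # Editors may only bid while the fixed window is open.
        if self.window_open(today):
            self.bids.append(Bid(journal, today))

    def choose_journal(self, author_preference):
        # Authors pick from the journals that actually bid, in their own order.
        bidders = {b.journal for b in self.bids}
        return next((j for j in author_preference if j in bidders), None)

sub = Submission("2013-12-0042", date(2013, 12, 10))
sub.add_bid("Open Lint Letters", date(2013, 12, 20))
sub.add_bid("J. Niche Results", date(2013, 12, 22))
sub.add_bid("Late Bidder Weekly", date(2014, 1, 20))   # ignored: window closed
print(sub.choose_journal(["Nature", "J. Niche Results", "Open Lint Letters"]))
# -> "J. Niche Results"
```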
it's all the same people (Score:3)
Who are the editors at these journals? They're largely former researchers from popular academic research groups.
Who are the government program managers looking at journal statistics to judge research quality? They're largely former researchers from popular academic groups.
Who are the university administrators creating the publish or perish environment? They're largely former researchers from popular academic groups.
These relationships are the defining characteristic of modern scientific research. Despite the heartache and frustration the system causes, it also produces a huge amount of value for the rest of us.
Over the last 30 years, the commercial labs, defense contractors and government facilities have all become subordinate to university R&D. This has combined the metrics university research has traditionally used with the competition of the private sector. If we want to change things, we need to change the basic structure of how we do research again.
We didn't like using private funding as a success metric. Now we don't like using citations as a success metric. Ok, what else can we use?
Re: (Score:1, Troll)
Well, I can't tell whether you're a chatterbot or schizophrenic, so I guess you just passed a test for the first time in your life.
Re: (Score:2)