Science

Firms Churning Out Fake Papers Are Now Bribing Journal Editors (science.org)

Nicholas Wise is a fluid dynamics researcher who moonlights as a scientific fraud buster, reports Science magazine. And last June he "was digging around on shady Facebook groups when he came across something he had never seen before." Wise was all too familiar with offers to sell or buy author slots and reviews on scientific papers — the signs of a busy paper mill. Exploiting the growing pressure on scientists worldwide to amass publications even if they lack resources to undertake quality research, these furtive intermediaries by some accounts pump out tens or even hundreds of thousands of articles every year. Many contain made-up data; others are plagiarized or of low quality. Regardless, authors pay to have their names on them, and the mills can make tidy profits.

But what Wise was seeing this time was new. Rather than targeting potential authors and reviewers, someone who called himself Jack Ben, of a firm whose Chinese name translates to Olive Academic, was going for journal editors — offering large sums of cash to these gatekeepers in return for accepting papers for publication. "Sure you will make money from us," Ben promised prospective collaborators in a document linked from the Facebook posts, along with screenshots showing transfers of up to $20,000 or more. In several cases, the recipient's name could be made out through sloppy blurring, as could the titles of two papers. More than 50 journal editors had already signed on, he wrote. There was even an online form for interested editors to fill out...

Publishers and journals, recognizing the threat, have beefed up their research integrity teams and retracted papers, sometimes by the hundreds. They are investing in ways to better spot third-party involvement, such as screening tools meant to flag bogus papers. So cash-rich paper mills have evidently adopted a new tactic: bribing editors and planting their own agents on editorial boards to ensure publication of their manuscripts. An investigation by Science and Retraction Watch, in partnership with Wise and other industry experts, identified several paper mills and more than 30 editors of reputable journals who appear to be involved in this type of activity. Many were guest editors of special issues, which have been flagged in the past as particularly vulnerable to abuse because they are edited separately from the regular journal. But several were regular editors or members of journal editorial boards. And this is likely just the tip of the iceberg.

The spokesperson for one journal publisher tells Science that its editors are receiving bribe offers every week.

Thanks to long-time Slashdot reader schwit1 for sharing the article.


  • by Baron_Yam ( 643147 ) on Saturday February 03, 2024 @02:01PM (#64210858)

    Set out rules for what defines minimum quality research, review, and publication standards. Determine appropriate penalties for breaching them and publishing. (Obviously you have to allow for fools and amateurs, which is why you wouldn't have the rules kick in until the publication stage)

    Now get your legislators to enshrine them in law, and have those laws enforced.

    Whatever nation does this first will come to have the most respected and trustworthy science publications, and it could even get to the point where nationals of other countries choose yours to publish simply because it would add credibility to their research.

    • The best 'sunshine,' where possible, is replicability. Computer science and AI are fortunate here: increasingly, if you don't release your code (and model weights, in the case of AI), people don't much care what you say. If it proves out and goes into wide use, then you're renowned. Try faking that.
      • That can work in the computational fields, where duplicating results might be as easy as downloading some code, followed by 50-100 mouse clicks. It's not nearly as easy for physical science and engineering, where replicating a paper's result requires a million dollars of specialized lab equipment, 2 or 3 PhD students, postdocs, or research scientists, and a year or two of time. Oh, and as a researcher, you will get basically zero professional credit for burning a year of your life doing it. The credit goes to the original authors.
        • Maybe the economic calculus is different in India and China.
        • > Oh, and as a researcher, you will get basically zero professional credit for burning a year of your life doing it.

          This really ought to change, but I imagine there's a LOT of cultural inertia in the sciences that would resist it.

          Your discovery, unless it is one of the rare obvious-in-retrospect ones, should not be considered settled until an independent, reputable team has verified it. And that team ought to get a reasonable fraction of the credit and any benefits that flow from it, because your discovery means little until it is confirmed.

    • There is no point in passing NEW laws when the system is not willing to enforce existing laws, other than to enable future oppression and political persecution.

      We already have plenty of laws regarding fraud, bribery, etc. We also have laws governing racketeering [wikipedia.org]. Such laws, in a just and orderly society, are to be applied equally to anybody found to have violated them. It is corruption when such laws are either selectively enforced (only certain people prosecuted) OR selectively not enforced (certain people are allowed to get away with it).

      • In this case, I suspect it is difficult to find someone with standing to sue, though I am not a lawyer.

        You don't necessarily need a new law if you can simply attach a new clause to an existing one, but I think the amount of care you'd want to take for this specific class of offence would likely require it.

    • Set out rules for what defines minimum quality research

      Journals already have minimum standards, but this is always going to be a subjective criterion that needs to be judged by a human, since part of the quality is the importance to the field. You cannot possibly enshrine it in law, since we cannot afford a law that prevents the publication of important new research which, because of its novelty, does not match whatever criteria were established in law. It's also a lot easier to make up papers that sound good than it is to actually do good research, so it is hard to filter them out.

  • is to drown in its own bullshit.
    • We could define this as fraud, and also define a corporate death penalty for organizations that abuse the public's trust.

      We won't do any of that, but at least I can be satisfied that it's possible and we chose not to do it.

    • The currently-hyped "AI" excels at consuming and generating BS. Aside from the potential collapse of the BS economy, maybe we're facing a different kind of "gray goo" apocalypse than the nanomachine folks envisioned.

      • What is the difference between the nanomachine "gray goo" and the kind between the ears of scamsters who write bogus scientific papers?
  • by MpVpRb ( 1423381 ) on Saturday February 03, 2024 @02:17PM (#64210898)

    As long as success is measured by the number of papers published, and those who publish fewer are punished, this kind of behavior will naturally arise.
    We need a different metric for quantifying quality science and the value of researchers.

    • Who exactly measures success like this?

      Pretty sure it's only universities.

      Corporations measure success by number of licensable patents and so forth.
      • I saw something on this in The Guardian [theguardian.com] earlier today. Quoting from that,

        The startling rise in the publication of sham science papers has its roots in China, where young doctors and scientists seeking promotion were required to have published scientific papers. Shadow organisations – known as “paper mills” – began to supply fabricated work for publication in journals there.

        The practice has since spread to India, Iran, Russia, former Soviet Union states and eastern Europe,

        and

        “I

    • This is what mystifies me. At least in the fields I'm familiar with nobody really looks at the number of publications, it is the quality that counts for vastly more. Having one paper in a top journal like Nature, Science etc. is worth far more than any number of papers in an unknown journal.

      I suspect they are perhaps preying on junior researchers who do not understand this.
  • I don't know. This sounds awfully close to "a secret plan by a group to do something unlawful or harmful".

    That would make anyone that believes this a conspiracy theorist and also someone that doesn't follow "the science".

    This could put into question a whole lot of science around COV.... ahhh... who cares, everyone has collected their money anyways.

  • Over thirty years ago, I gave up all hope of reading even a fraction of the publications in my general field of knowledge. Publish or perish filled the literature with junk papers that contained nothing really new and no useful information. I found lots of papers in high-end journals that were something-or-other part V and only referenced parts 1 to 4, which strongly suggests that the authors had not even bothered to do a literature review on the subject. Now, with pay-to-publish journals and the internet, it has only gotten worse.
  • I get unsolicited applications to join my lab on an almost daily basis. Many of them come from the parts of the world mentioned by other posters, like China, Iran, Russia, etc. I scan their CVs, just on the outside chance there's a hidden gem, and I am stunned by the number of publications they have --- in journals I've never heard of before. Sometimes, for fun, I check the impact factor of these journals and they're at or below 1. That means no-one is citing those papers. They're junk.

    Things like curation by journal editors and measurement of that through impact factor *are* still useful.

    • by davidwr ( 791652 )

      Sometimes, for fun, I check the impact factor of these journals and they're at or below 1. That means no-one is citing those papers. They're junk.

      Things like curation by journal editors and measurement of that through impact factor *are* still useful.

      Just be sure to eliminate "paper-mill-journal citations of paper-mill-journals" that inflate the impact factor.
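      For concreteness, the two-year impact factor these posters are discussing is just the citations received this year to a journal's articles from the previous two years, divided by the number of those articles. A minimal sketch of that arithmetic, with a crude filter for citations from a blocklist of suspect journals; all journal names and numbers below are made up for illustration:

```python
# Two-year impact factor: citations received this year to articles the
# journal published in the previous two years, divided by the number of
# those articles. Citations from blocklisted journals (e.g. suspected
# paper mills citing each other) are discounted, as suggested above.

def two_year_impact_factor(citing_journals, articles_prev_two_years,
                           blocklist=frozenset()):
    counted = sum(1 for j in citing_journals if j not in blocklist)
    return counted / articles_prev_two_years

# Hypothetical example: 40 citations this year (10 from a suspect journal)
# against 80 articles published over the previous two years.
cites = ["J. Real Sci."] * 30 + ["Intl. J. Mill Output"] * 10
print(two_year_impact_factor(cites, 80))                            # 0.5
print(two_year_impact_factor(cites, 80, {"Intl. J. Mill Output"}))  # 0.375
```

      In practice the hard part is building that blocklist: paper-mill journals that cite each other in rings look like ordinary citations until someone maps the ring.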

  • At one point science journals were run by schools, but then economic pressure led to them being outsourced to specialized publishers. Having unpaid reviewers evaluate papers goes back to the era before journals were stand-alone businesses.

    Over time it became a racket. Due to the "publish or perish" nature of academic life, publishers realized they could charge outrageous fees for access to their content. The authors have to provide the content for free, and the reviews and much of the editorial work are done for free as well.

  • When looking for papers, one can exclude everything from China and India.
