Science

Bloggers Put Scientific Method To the Test

ananyo writes "Scrounging chemicals and equipment in their spare time, a team of chemistry bloggers is trying to replicate published protocols for making molecules. The researchers want to check how easy it is to repeat the recipes that scientists report in papers — and are inviting fellow chemists to join them. Blogger See Arr Oh, chemistry graduate student Matt Katcher from Princeton, New Jersey, and two bloggers called Organometallica and BRSM, have together launched Blog Syn, in which they report their progress online. Among the frustrations that led the team to set up Blog Syn are claims that reactions yield products in greater amounts than seems reasonable, and scanty detail about specific conditions in which to run reactions. In some cases, reactions are reported which seem too good to be true — such as a 2009 paper which was corrected within 24 hours by web-savvy chemists live-blogging the experiment; an episode which partially inspired Blog Syn. According to chemist Peter Scott of the University of Warwick in Coventry, UK, synthetic chemists spend most of their time getting published reactions to work. 'That is the elephant in the room of synthetic chemistry.'"
  • by damn_registrars ( 1103043 ) <damn.registrars@gmail.com> on Monday January 21, 2013 @09:15PM (#42652853) Homepage Journal
    The bloggers are not testing the scientific method; they are testing methods that are scientific. Those are two vastly different concepts. Their work is important, but not epic.
  • by WrongMonkey ( 1027334 ) on Monday January 21, 2013 @09:41PM (#42652979)
    It's not a secret that about half of published synthesis methods are garbage and yield values are wildly creative. Reviewers don't have the means to verify these, so anything that seems plausible gets published. Then researchers are left to sort out the best methods based on which ones get the most citations.
  • by flink ( 18449 ) on Monday January 21, 2013 @09:59PM (#42653097)

    Right, but they are utilizing the scientific method to test the quality of published papers, not attempting to verify the utility of the scientific method itself.

    The headline should read "Bloggers apply scientific method to validate published findings".

  • by phantomfive ( 622387 ) on Monday January 21, 2013 @10:06PM (#42653137) Journal

    "Bloggers apply scientific method to validate published findings".

    A much better headline.

  • by paiute ( 550198 ) on Monday January 21, 2013 @11:23PM (#42653553)

    Frankly, chemistry is among the easiest of the physical sciences. I say this as the physicist who was tasked by the chemists to fix their gear when it broke down.

    Organic chemistry is quite difficult. The purpose of synthesis is not, as you suppose, just to mix A and B, see what happens, and publish. Most organic chemists are trying to make specific transformations on certain parts of molecules in high conversion, while controlling variables such as time, temperature, concentration, and reagent reactivity with substrate functional groups.

    Physics is just a block on an inclined plane and variations.

  • by __aaltlg1547 ( 2541114 ) on Tuesday January 22, 2013 @01:32AM (#42654225)
    No, it means that the original experimenters didn't describe their experiment correctly. Or worse, may have never done it at all...
  • Not reproducible (Score:3, Insightful)

    by jotajota1968 ( 2666561 ) on Tuesday January 22, 2013 @08:51AM (#42655821)
    I am a physicist. I do believe that a large percentage (let's say 50%) of scientific publications do not meet basic quality standards, let alone the scientific method. It also depends on the level of the journal, but even in the best journals you can find articles and results that cannot be reproduced. The pressure to publish is too strong. Anyway, 70% or more of articles are only academic exercises, lack robust statistics, or do not receive more than one or two citations (including a couple of my own). Only the best articles (should) prevail with time (Darwinism).
  • by vlm ( 69642 ) on Tuesday January 22, 2013 @09:05AM (#42655871)

    A scientist should also run experiments multiple times to see if the results are repeatable before publishing those results.

    Won't help. I studied to be a chemist (admittedly a long time ago) and by far the biggest non-ethical problem out there is contamination.

    So it turns out that the peculiar reaction you're studying is iron-catalyzed (in fact, it's incredibly sensitive to iron), but no one in the world knows that yet. And your reagents are contaminated. Or your glassware, which you thought was brand new and/or well cleaned, is contaminated. Or your lab is downwind of a hematite ore processor and the room dust is contaminated.

    Sure, you say, test everything. Well, there isn't time or money for that, but for the sake of argument we'll assume there is. What if dust from the hematite ore processor is far larger than the filter-paper pores in the filtration stage of your overall process? Test the reagents and product all you want, but you'll never find iron anywhere except in the room dust (which you already knew about) and the debris in the filter paper (which you assume was contaminated by room dust AFTER removal from the apparatus).

    The most important thing is that this is the norm in chemistry, not an outlier. Chemistry is not math or CS; sometimes stuff just doesn't work, or just works, for no apparent reason. Unlike some technologies, detailed modeling of the "why" and "how" often doesn't happen for years, decades, or centuries after the ChemEng team has been selling product and the papers have been written.

    A very important lesson is analysis paralysis. So you live downwind of a hematite ore crusher. And you know it. And periodic tests of your lab show iron-enriched house dust. But you can't go around testing everything, because you're surrounded by millions of things to test for. You're a gardener; god only knows what's on your hands. Skin oil of certain blood types is a contaminant? Your breath has a tinge of ethanol in it from last night? Maybe it's your perfume / cologne / antiperspirant / nail polish? The point of discovery is that it's literally unknown... maybe wearing nitrile gloves instead of old-fashioned latex "does something" good or bad to the reaction.

    I think the main thing "slashdotter IT people" need to understand is that most chemistry and most chemical engineering runs at somewhat less than 6 sig figs. This is incomprehensible to IT people... if your T1 or whatever LAN had a BER worse than 1e-5, you'd call it in for repair. If you got one thousand read errors when you read a 1 gig DVD, you'd throw out that DVD. If your processor runs at 1500 MIPS, then at a six-sig-fig error rate it would crash about 1500 times per second (a quick sketch of that arithmetic follows this comment). The bad news is that six sig figs is actually pretty good work for a chemistry lab. Certainly undergrads could never aspire to that level; it takes skill, experience, and specialized equipment....

    Please spare me the details of one peculiar quantitative analysis technique that, in one weird anecdote, once measured down to tiny fractions of a ppt. The overall system cannot be "cleaner" than the filthiest link in the entire system. Is the hand soap in your lab spectrographically pure? The unused toilet paper in the bathroom? Are all of your hoods and benches and storage cabinets in a verified and tested cleanroom environment? Seriously? Do the drinking-water cooler and lab fridge also only hold spectrographically pure substances? Please, no anecdotes.
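For the IT-minded reader, here is a minimal Python sketch of the back-of-envelope arithmetic in the comment above. The 1e-6 "six sig fig" rate, the 1 gig DVD size, and the 1500 MIPS figure are simply the post's illustrative numbers, not measured values.

    # Back-of-envelope error-rate arithmetic from the comment above.
    # All numbers are the post's illustrative figures, not measurements.
    error_rate = 1e-6                  # "six sig figs": roughly one error per million operations

    dvd_bytes = 1e9                    # the "1 gig DVD" from the post
    dvd_read_errors = dvd_bytes * error_rate
    print(f"Expected DVD read errors: {dvd_read_errors:,.0f}")          # ~1,000

    instructions_per_second = 1500e6   # a "1500 MIPS" processor
    faults_per_second = instructions_per_second * error_rate
    print(f"Expected CPU faults per second: {faults_per_second:,.0f}")  # ~1,500

The point of the comparison stands: an error rate that would get network gear sent in for repair counts as excellent work at the bench.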

"Gravitation cannot be held responsible for people falling in love." -- Albert Einstein

Working...