Science

Have Scientific Breakthroughs Declined? (japantimes.co.jp) 114

Some researchers say we've seen a fall in disruptive new discoveries. But we may be entering a golden age of applied science. From a report: 2023 had barely begun when scientists got some jolting news. On Jan. 4, a paper appeared in Nature claiming that disruptive scientific findings have been waning since 1945. Scientists took this as an affront. The New York Times interpreted the study to mean that scientists aren't producing as many "real breakthroughs" or "intellectual leaps" or "pioneering discoveries." That seems paradoxical when each year brings a new crop of exciting findings. In the 12 months following that paper, scientists have listened to the close encounters between supermassive black holes, demonstrated the power of new weight loss drugs and brought to market life-changing gene therapies for sickle cell disease.

What the authors of the January paper measured was a changing pattern in the way papers were cited. They created an index of disruptiveness that measured how much a finding marked a break with the past. A more disruptive paper would be cited by many future papers while previous papers in the same area would be cited less -- presumably because they were rendered obsolete. This pattern, they found, has been on a decades-long decline. One of the authors, Russell Funk of the Carlson School of Management at the University of Minnesota, said they wanted to measure how new findings shifted attention away from old ways of doing things.
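
The mechanics of that index are simple enough to sketch in code. Here is a minimal, illustrative Python version of a CD-style disruption score (the function name and data layout are assumptions for illustration, not the authors' actual code):

    def cd_index(focal_id, focal_refs, later_papers):
        # focal_refs:   set of papers the focal paper cites (its predecessors)
        # later_papers: reference sets of papers published after the focal one
        n = total = 0
        for refs in later_papers:
            cites_focal = focal_id in refs
            cites_predecessors = bool(refs & focal_refs)
            if not (cites_focal or cites_predecessors):
                continue  # outside the focal paper's citation neighborhood
            n += 1
            if cites_focal and not cites_predecessors:
                total += 1  # disruptive: the focal paper eclipsed its sources
            elif cites_focal and cites_predecessors:
                total -= 1  # consolidating: cited alongside its sources
        return total / n if n else 0.0

A score near +1 means later work cites the paper while ignoring what it built on; a score near -1 means the paper keeps being cited alongside its predecessors, i.e. it consolidated rather than displaced them.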

"Science definitely benefits from a cumulative work and studies that come along and refine our existing ideas. But it also benefits from being shaken up every now and then," he said. We're seeing fewer shake-ups now. Funk said he thinks it's related to funding agents taking too few risks. But others say it may only reflect changes in the way scientists cite each other's work. Scientists I talked to said researchers cite papers for many reasons -- including as way to ingratiate themselves with colleagues, mentors or advisers. Papers on techniques get a disproportionate number of citations, as do review articles because they're easier to cite than going back to the original discoveries. Citations in papers are "noisy data" Funk admitted, but there's a lot of it -- millions of papers -- and such data can reveal interesting trends. He agreed, though, that people shouldn't conflate disruption with importance. He gave the example of the LIGO (the Laser Interferometer Gravitational-Wave Observatory), which made a big splash in 2016 by detecting gravitational waves, long ago predicted by Einstein. By his definition it was not disruptive.

  • It was said that Tang was invented for use on the Space Shuttle. That's the last important invention!

    (yes, sarcasm... and yes, I know it wasn't invented for the Space Shuttle)

    • https://www.seattletimes.com/life/food-drink/the-taste-of-space-tang-sucks-the-truth-about-so-called-astronaut-ice-cream-and-vegans-for-the-win/

      https://www.foodandwine.com/lifestyle/how-nasa-made-tang-cool

  • I dunno (Score:5, Insightful)

    by MightyMartian ( 840721 ) on Monday January 08, 2024 @12:27PM (#64141075) Journal

    Just how common is "disruptive" science? While there have certainly been major breakthroughs over the years (indeed centuries), what we often don't hear about is the slow and steady work that leads up to the breakthrough. Even something as truly earth-shattering as evolutionary theory was built on top of work by Darwin's antecedents, in particular Linnaeus's taxonomies, which, when put under the light of natural selection, just sort of clicked for pretty much every 19th century naturalist. The same goes for Special and General Relativity, both of which were built atop the work of Einstein's predecessors like Lorentz and Maxwell.

    • Agreed. I'd argue since we've finally (sort of) gotten more energy out of fusion than we put in, that can be pretty disruptive. On a day to day basis what we identify as 'disruptive' is something that was actually discovered a while earlier and then development on it continued until it really gets the attention of the populace. I remember in the 90s seeing an ad mention 'the new and totally disruptive technology called peer to peer sharing' - er, if by 'disruptive' you mean 'something invented in
      • Getting more energy out of fusion is not new science; it's applied science.

        • It is a significant breakthrough, however. My observation about applied science is that it's applying known principles in a known area. That's not quite the case here.
        • Depends on whether practical fusion (if it exists) is very different from stellar fusion. We currently know how fusion in stars works, but getting a large amount of hydrogen in one place and waiting around for a million years or so is not practical. The promise of cold fusion has not borne any viable pathways.
    • Re:I dunno (Score:5, Interesting)

      by HiThere ( 15173 ) <charleshixsn@ear ... .net minus punct> on Monday January 08, 2024 @12:49PM (#64141183)

      Also true is that the major disruptive discoveries/inventions often take decades to be appreciated. The laser was "invented" in the 1940s, but couldn't be built at the time. ISTR that it was built during the 1970s (no, it was in 1960), but it took a long time to turn it into something useful.

      The thing is, inventing something is often the easy part. Building the first one often creates something that isn't very useful...or is used for something unrelated to the "important" later uses. Rockets were invented in medieval China and used mainly for fireworks displays.

      What was the most important recent invention? Who knows. It could be LLMs, it could be CRISPR, it could be something that didn't even make headlines. We'll judge that reasonably in about a decade...but be prepared for folks to later change their minds.

      • Maser came first. The science was metastable energy levels. Maser itself and later laser were applied science.

        • Also stimulated emission, which Einstein predicted theoretically in the 1910s. I thought the maser was the first experimental proof that stimulated emission was real.

          • by HiThere ( 15173 )

            Sorry, I heard the math hadn't been worked out until the 1940s. OK, the laser was invented in the 1910s.

            • I think there was a variety of bits. What I meant was the maser wasn't just applied science, it also confirmed some theoretical predictions (i.e. stimulated emission) for the first time.

      • Most of the useful things now come from adding elbow grease to known phenomena. That requires more engineering know-how and hands-on tuning than pure science.

        I believe we've tapped out most of the low-hanging fruit of raw science, and getting to the next step will require smarter controls and processes to leverage prior discoveries.

        I'm sure there will be interesting new raw-science discoveries, but less frequently than in the past.

        • by HiThere ( 15173 )

          I have a wild guess that important discoveries have a frequency analogous to the frequency of prime numbers...which pretty much agrees with your suggestion.

          OTOH, when you have more people looking, you expect the rare events to show up more quickly.

    • Re: (Score:2, Insightful)

      by guruevi ( 827432 )

      According to the author, it was more common in the past. Disruption is what gets you sent to the principal's office, so disruption for many people is beaten out of them by the time they are adults. Most people don't like disruptors, it's risky, it's costly, better go with the flow, parrot the narrative, don't say anything offensive. If you have that kind of attitude you will get a society that largely doesn't have disruption. If everything is the same and there is no individualism, innovation stalls and you

      • As scientists understand the natural world better, there is less that they are wrong about, and hence less disruption where previous citations drop away because they were profoundly mistaken.

        Quantum mechanics and relativity and DNA and evolution are not going to go away.

        There are many more people working on similar tasks compared to 150 years ago and so there is a more continuous stream of research, and because of that there are fewer large gaps to leap over with a singular disruptive paper, as there was al

        • by guruevi ( 827432 )

          So basically what you're saying is that we know everything and therefore don't need to innovate. That's exactly what I was saying is bad about the last 100 years of education.

          The point of the article was that there are many more people working on tasks and both proportionally and absolutely there are fewer results. It's not because we know everything, far from it, it's because funding has shifted largely to government which makes the enterprise of science afraid of taking risk as it gets rewarded for towing

          • So basically what you're saying is that we know everything and therefore don't need to innovate.

            I love your swinging from one wild extreme to another. If we don't know nothing then we must know EVERYTHING11!111one!1ONELEVEN111!11

            No. We know more than we used to. We are less wrong.

            it's because funding has shifted largely to government which makes the enterprise of science afraid of taking risk as it gets rewarded for towing the line

            This is you just makin' shit up that aligns with your politics.

            For example,

    • Re:I dunno (Score:5, Funny)

      by 93 Escort Wagon ( 326346 ) on Monday January 08, 2024 @01:36PM (#64141293)

      Just how common is "disruptive" science?

      Judging by my university's press releases, it's a daily occurrence.

      • Judging by my university's press releases,

        ... written by people who couldn't get a science job with their English degrees.

    • Physics (Score:5, Insightful)

      by Roger W Moore ( 538166 ) on Monday January 08, 2024 @02:24PM (#64141439) Journal

      Just how common is "disruptive" science?

      That depends on how you define "disruptive". If you mean "unexpected" then the drop in such surprises is likely due to theorists, often armed with computers, who can explore possibilities well before we on the experimental side can get there. Think of it like exploration after the age of satellites: the area you are going to explore has already been viewed from a distance, and so while there may still be significant discoveries to be made, e.g. caves, small geological formations etc., these are surprises on a smaller scale. This is why finding the Higgs boson was a major breakthrough but could hardly be described as "disruptive" because it was exactly what we expected thanks to the theorists: instead, had we excluded its existence that would have been disruptive!

      If you take "disruptive" to mean " causes us to rethink our fundamental understanding of the universe" then that has really only happened twice in Physics. Once with Newton in the early 18th century and again with Einstein in the early 20th century so it is very infrequent. However, I would draw some parallels between physics in the late 19th century and physics today. In both cases, it looked like we had most things explained with just a few minor problems left to sort out. In the case of the 19th century, it was things like the medium that light propagates through, the UV catastrophe and why Maxwell's equations broke Newtonian relativity. Today it's things like Dark Matter, why the Higgs is so light and the failure of models of Quantum Gravity.

      It took several decades to solve the "minor" problems at the end of the 19th century, but their solution gave us the very disruptive twin breakthroughs of relativity and quantum mechanics. We've now run with those for a century and are having problems making more progress, which suggests that perhaps the solutions to some of the issues we have today will require a similar unexpected and disruptive breakthrough in the not-too-distant future. Let's hope so, but whether that will happen in 5 years or 50 is impossible to predict...which is the very nature of a disruptive breakthrough: if, like the Higgs boson, you could predict it, it would hardly be disruptive!

      • If you take "disruptive" to mean " causes us to rethink our fundamental understanding of the universe" then that has really only happened twice in Physics. Once with Newton in the early 18th century and again with Einstein in the early 20th century so it is very infrequent.

        Also once with Copernicus (not the first, but the first person to really make it stick and upend things).

        Not Physics, but Darwin really caused a fundamental rethink in the understanding of the universe, in that it wasn't in fact just put

        • One of the stunning things about Darwin's theory was how, in very short order, pretty much every naturalist out there went "Yup, that's the answer". There were certainly ecclesiastical objections, but within the scientific community, within a few years of the publication of On the Origin of Species, evolution wasn't really a topic of controversy. All the controversy came from outside scientific circles, as the full horror of someone having developed a scientific theory that, gasp, inferred (even though Darwi

      • by jythie ( 914043 )
        Yeah, I think a big part of it is that over the last century we went from being 'mostly wrong' to 'mostly right', which is a one-time transition. We will learn more and more, but a lot of people seem to expect the 'mostly right' to turn out to be 'mostly wrong' so they can fulfill their fantasy future with a whole new 'mostly right'.
    • It seems that many posters are reinventing the work of Thomas Kuhn in his 1962 "The Structure of Scientific Revolutions." A scientific revolution creates a new paradigm (yes, that's where we get that word) that replaces an older paradigm. Einstein vs. Newton is a perfect example. Once the new paradigm is accepted by all or most of the revolutionary's peers, all future scientific work is based on the new paradigm. Future work is performed by refiners as opposed to revolutionaries. All scientists would lik
  • Well, to be fair (Score:5, Interesting)

    by zkiwi34 ( 974563 ) on Monday January 08, 2024 @12:33PM (#64141103)
    There seems to be lots and lots of work, but more and more of that work is complete rubbish.
    • by HiThere ( 15173 ) <charleshixsn@ear ... .net minus punct> on Monday January 08, 2024 @12:56PM (#64141197)

      You've been corrupted by histories that only list the successes, and not the failures and dead ends. (Epicycles were a success, even though we've got better ways to predict sky positions now.)

      There are ALWAYS a lot more failures than successes. And there are always a lot more "advances" that are dead ends. But it's the successes that matter. (Still, winnowing out the frauds would make the process a lot more efficient...but you'll still get a lot more failures than successes.)

      • Some of the most creative, fascinating science I've ever read is the era right after thermodynamic theory got really solid but nuclear theory was barely in its infancy and everyone was falling all over themselves to explain wait HOW THE FUCK DOES THE SUN WORK THEN
      • by jythie ( 914043 )
        One of the other big issues is history and Zipf's law... the popular imagination only has room for a few well-known names, maybe one per era per discipline, so people end up expecting some singular super-scientist to be the solution to everything.
      • I think zkiwi's point was about the something-under-50% rate of reproducibility in modern scientific publishing.

        LITERALLY most published "science" is, indeed, rubbish.

        I don't believe that was the case in the early 20th century.

      • by mjwx ( 966435 )

        You've been corrupted by histories that only list the successes, and not the failures and dead ends. (Epicycles were a success, even though we've got better ways to predict sky positions now.)

        There are ALWAYS a lot more failures than successes. And there are always a lot more "advances" that are dead ends. But it's the successes that matter. (Still, winnowing out the frauds would make the process a lot more efficient...but you'll still get a lot more failures than successes.)

        Indeed, John Logie Baird, inventor of the television, invented many other things, and television wasn't even a successful invention for Baird. Baird made most of his money from thermal socks (the Baird undersock), which he invented because he had very poor circulation, ergo, cold feet; yet we all know television but have no idea about the Baird undersock. He also invented shoes with air pockets in the soles, which were adopted by the Dr. Martens brand.

  • The easy shit has been done. You're not going to just stumble on a brand new, simple, disruptive thing. New significant discoveries will take huge amounts of time, money, and other resources.

    Seems perfectly correct that the pace of disruptive innovation has slowed.

    • by jd ( 1658 )

      That's only true if you assume education stands still or regresses.

      In physics, for example, much of the new work is being done in information physics, which doesn't require a big budget. Hawking Radiation, in quantum cosmology, was demonstrated with a cheap analogue in someone's laboratory. We're nowhere near done on the cheap physics. But it takes minds that aren't shackled by having to churn at tedious rates through contemporary schools. Eliminating streaming and slowing down the bright kids has simply reduced

    • It's also, I think, the case that there is a difference between scientific discoveries, engineering advancements, and engineering maturations.

      Take 2023 and AI. It *feels* like AI took a major advance in 2023. But all it was was the maturation of the Transformer model, which is an engineering advancement from 5 years prior based on the RNN from 20 years prior, itself an iteration of the Perceptron from 30 years prior to that. But it's the *implementation*, the engineering maturation, that created the shockwav

  • Serious question:

    Has there ever been a real breakthrough theory backed up by replicated experiments in the "science" of psychology?
    • by Okian Warrior ( 537106 ) on Monday January 08, 2024 @02:34PM (#64141483) Homepage Journal

      Serious question:

      Has there ever been a real breakthrough theory backed up by replicated experiments in the "science" of psychology?

      Yes, the Big 5 personality model [wikipedia.org].

      The theory begins with the "lexical hypothesis", which says that the ability to characterize someone's personality is so important to survival that our language has evolved to describe it accurately.

      Then you survey all the words in English that describe someone's personality: easygoing, friendly, hard working, nervous, and so on, and see which words are associated with other descriptive words. You do a multivariate analysis and see if the words clump together into identifiable islands of meaning.

      When you do that for English, there are 5 clumps. Use a reasonably representative word to describe each clumping, and you get: Openness, Conscientiousness, Extroversion, Agreeableness, and Neuroticism.
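
      For the statistically inclined, the "clumping" step is classic factor analysis. Here is a minimal sketch in Python, assuming a respondents-by-adjectives matrix of survey ratings (random noise stands in for real data, so the factors below are meaningless; with real lexical data, five interpretable factors emerge):

          import numpy as np
          from sklearn.decomposition import FactorAnalysis

          # One row per survey respondent, one column per trait adjective
          # ("easygoing", "nervous", "hard-working", ...).
          rng = np.random.default_rng(0)
          ratings = rng.normal(size=(1000, 40))

          fa = FactorAnalysis(n_components=5).fit(ratings)

          # loadings[i, j]: how strongly adjective j loads on latent factor i.
          # With real lexical data the five factors line up with Openness,
          # Conscientiousness, Extroversion, Agreeableness and Neuroticism.
          loadings = fa.components_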

      Because no one in the field believed the theory, a lot of tests tried to disprove it - which all failed. The Big 5 is not a single paper reporting on one experiment, it's been done over and over. It's been done for different cultures, different languages, even historical languages, and the results are still the same: personality can be accurately described by 5 traits.

      You might be aware of the Myers-Briggs [wikipedia.org] personality indicator, the one with 4 letters (example: INTJ) to describe your personality. The predictive power of the Myers-Briggs system is zero: the Myers-Briggs is not associated with variance in any other psychological measure such as success or number of friends.

      The Big 5 trait measurements have very strong predictive power: 40% of your success in life is due to intelligence, but 30% is due to your conscientiousness. Men and women have reliably different measures of Agreeableness and Neuroticism, while conservatives score high in Conscientiousness and liberals score high in Openness. This has led to a revolution in our understanding of why people act the way they do, and what their motivations are.

      Additionally: the method for curing phobias is now settled science. If you have a fear of, for example, heights, there are ways to cure it that work, and there are ways that don't work and will make the problem worse. Backed by study and experiment, it's used by just about everyone in the field to address this specific problem in their patients.

      Modern psychology is based quite a bit on strong scientific findings.

      • Thanks for that.

        But the Big Five is the ONLY psychometric test that has any scientific consensus behind it.

        > Modern psychology is based quite a bit on strong scientific findings.

        Modern psychology has a gross replication crisis. More "occasionally" rather than "quite a bit".
    • Might as well ask if chiropractors are real doctors, if there is a mathematical basis for the 2% inflation target, if nicotine should be treated as a controlled substance, if Bitcoin has any intrinsic value, and so forth.

      Once you start actually digging you will either be mad, or go mad. Sometimes both.

      • Depending on the country, they are real doctors. What else would they be?
        Chiropractic is a branch of orthopaedics. The word does not mean what you think it means (you must be American?).

  • The last real game changer was the discovery of the semiconductor. Everything afterward has been incremental. Name the biggest breakthrough since the transistor.

    • Re: (Score:3, Informative)

      by groobly ( 6155920 )

      Um, superconductivity?

    • Re:Kind of (Score:4, Interesting)

      by serviscope_minor ( 664417 ) on Monday January 08, 2024 @01:48PM (#64141331) Journal

      The last real game changer was the discovery of the semiconductor. Everything afterward has been incremental. Name the biggest breakthrough since the transistor.

      Yet somehow we've gone from 1953's Transistor Computer, whose transistors were lower power but LESS RELIABLE than the valves they replaced (it could reportedly run for about an hour) and which was slower than the existing valve-based machines, to, well, modern devices, using only "incremental improvements".

      • Yes, incremental. Other than changing from germanium to silicon, a transistor from today is exactly the same as the 1947 prototype. Sure they have gotten orders of magnitude smaller but it’s still a semiconductor.

        • Other than changing from germanium to silicon, a transistor from today is exactly the same as the 1947 prototype.

          You may as well say that they operate the same way as a triode, with one terminal controlling the flow of current between two others. Point contact transistors don't work the same way as either BJTs or the variety of FETs on offer.

          Sure they have gotten orders of magnitude smaller but it's still a semiconductor.

          So's a crystal diode as discovered in 1874, and used for radio receivers b

        • No, they are not the same thing. Fundamentally they work completely differently.
          The originals, for the first decades, worked by semiconductor/diode effects; in our times they are FETs - field-effect transistors. In some cases a single atom is the switch.

    • by Anonymous Coward

      Name the biggest breakthrough since the transistor.

      DNA/genes.

      • by jd ( 1658 )

        Agreed. There have been several breakthroughs there - mtDNA sequencing, yDNA STR counting, genotyping, whole genome sequencing, and the other side of the coin, CRISPR-Cas9 gene editing.

    • by Cyberax ( 705495 )

      Name the biggest breakthrough since the transistor.

      The whole field of molecular biology.

    • The integrated circuit.

    • by jd ( 1658 )

      Graphene, metamaterials, quantum tunnelling (without which you couldn't have modern chips), superconductivity, ultrapure silicon-28, ultrapure silicon-29, GaAs semiconductors, optically stimulated fluorescence, thermally stimulated fluorescence, synchrotron radiation, classical lasers, semiconductor lasers, optical fibre, holography, magnetoencephalography, magnetic resonance imaging.

    • In cosmology: the age of the universe, the size of the known universe, confirmation of black holes. Heck, the Hubble Space Telescope could be credited with many of these significant breakthroughs.
  • by Sique ( 173459 ) on Monday January 08, 2024 @01:14PM (#64141239) Homepage
    The measurement of disruptiveness could be way off because of the way it is measured.

    How many papers actually cite "Zur Elektrodynamik bewegter Körper" (On the Electrodynamics of Moving Bodies)? Nevertheless, this is the paper that for the first time revealed Special Relativity.

    (Interestingly though, I've seen lots of YouTube videos recently which use Special Relativity to explain magnetism. But curiously enough, no video ever mentions that Albert Einstein actually developed Special Relativity to explain magnets in the first place.)

    • by mbkennel ( 97636 )

      Actually of the "OG" papers, Einsteins on SR gets more citations than most old ones, particularly the English translation.

      Special relativity does not explain ferromagnets, though otherwise electromagnetic theory was primary.

      Einstein completed the one thing that Maxwell didn't get in classical electrodynamics, and then took the extraordinary leap that the recent Maxwell theory was fully correct as is and the successful and incontrovertible Newton needed modification -- the opposite of everyone else working o

    • by jd ( 1658 )

      Well, ish.

      https://arxiv.org/abs/1112.317... [arxiv.org]

      As noted by others, Poincaré actually got there first.

      • by Sique ( 173459 )
        Yes and no. Henri Poincaré got the formulas right and co-published them with Hendrik Antoon Lorentz, hence we call them Lorentz transformations. But Henri Poincaré could not do away with the ephemeral Ether. Albert Einstein corrected Maxwell's electromagnetic equations to make them look the same independent of the movement of either the current carrying conductor or the magnet, so only their relative movement was important, hence the name "Relativity". Additionally, in the same year, he explained
  • by rsilvergun ( 571051 ) on Monday January 08, 2024 @01:34PM (#64141277)
    We took out a lot of low-hanging fruit. I mean it's really only within the last 100 years, if that, that we as a species have made a concerted and serious effort to apply the scientific method to increase our knowledge, and we still have to fight tooth and nail against the owner class to get the resources needed to continue to expand that knowledge.

    The people who control all the money don't look at the expansion of scientific knowledge as an inherent good; they're nervous about it, because scientific breakthroughs have the potential to upset their control of the economy. For example, we've had massive improvements to solar and wind power and improvements to electric car batteries that are already reducing the amount of fossil fuels used. If you think the Saudi Arabian king is going to sit idly by while that happens, you'd be crazy, and they have certainly done everything they can to slow the process.

    And that's a relatively uncontroversial example of it. For a more controversial one: birth control and generally increased education are causing birth rates to plummet pretty much everywhere they are implemented in a modern society. That's going to have broad impacts on the power base, because as populations decline the lack of a surplus population makes it harder to treat your workers like garbage. That's why post-World War II workers had it so good: we had killed off so many working-age men. Same thing with the Black Plague.

    So you've got the low-hanging fruit already knocked out, coupled with a powerful group of men and women who would really like to slow things down so they can make sure they retain control of everything. It is absolutely no surprise we're seeing a general slowdown in innovation.

    Like a buddy of mine says: if the 1% understood the internet, they never would have let us have it.
    • Some of the main reasons for reduced births were:
      a) Electricity and artificial light: people worked longer at home than they did with candlelight
      b) The TV: suddenly people spent night time with a different entertainment, rather than just making babies

      There also was a shift in values. Bismarck introduced health care and retirement funds. The older generation did not depend on having enough (surviving) kids to support them in old age.

  • by hdyoung ( 5182939 ) on Monday January 08, 2024 @01:49PM (#64141335)
    Over the last frikkin CENTURY, with the expectation that underlying practices haven't changed at all? And trying to draw conclusions about scientific progress? Ugh. Where to start. News flash: most of that time was pre-internet. Hell, for most of the 1900s, if you wanted to generate a citation list, you needed to go to a specialized library, IN PERSON, AND DIG THROUGH ACTUAL PHYSICAL BOOKS. Yeah, I know that sounds weird. But wait, there's more. Two thirds of your sources were in either Russian or German, and news flash, Google Translate didn't exist, so you were limited to the articles that had already been translated (only the most important ones), or you had to hunt down someone to translate for you (expensive/difficult), or you brushed up on your technical Russian/German skills, which was an actual thing back then for scientists.

    The result - old articles will typically have like 5-10 citations. Only the most important ones. And damn it took a lot of effort to generate those. Whereas nowadays I can build a 75-article citation list in just a few hours sitting on my couch. And the conclusion is that papers are getting less disruptive? Their entire idea is built around this “CD index”, and they don’t account for this little unimportant confounding factor called “the internet”? I searched the article text for “internet” and it came up zero. That’s like writing an article about how people are travelling longer distances than last century, but failing to address this little thing called “cars”.
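
    The direction of that confounder is easy to check with a toy simulation (purely illustrative, not a reanalysis of the paper; it reuses the CD-style score sketched in the story summary above):

        import random

        def cd_index(focal_id, focal_refs, later_papers):
            # Toy CD-style score, as sketched in the story summary above.
            n = total = 0
            for refs in later_papers:
                f, b = focal_id in refs, bool(refs & focal_refs)
                if f or b:
                    n += 1
                    total += 1 if (f and not b) else -1 if f else 0
            return total / n if n else 0.0

        def simulate(refs_per_paper, trials=2000):
            # Later papers always cite focal paper "F", then pad their
            # reference lists with random older papers; longer lists are
            # more likely to hit one of F's 10 predecessors by accident.
            focal_refs = set(range(10))   # F's predecessors: ids 0..9
            pool = list(range(100))       # 100 older papers to draw from
            later = [{"F"} | set(random.sample(pool, refs_per_paper))
                     for _ in range(trials)]
            return cd_index("F", focal_refs, later)

        print(simulate(5), simulate(75))  # the score falls as lists grow

    Longer reference lists mechanically push the score toward "consolidating" even when nothing about the underlying science has changed -- though, to be fair, a reply below notes that the authors say they controlled for citation practices.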

    THIS got into Nature? Were the peer reviewers sleeping?
    • google translate didn’t exist, you were limited to the articles that had already been translated (only the most important ones), or you had to hunt down someone to translate for you (expensive/difficult) or you brushed up on your technical Russian/German skills which was an actual thing back then for scientists.

      In the late 1980s and early 1990s when I was in college most scientific degrees had a foreign language requirement, and it generally had to be a language that was useful in the field. This was specifically to make sure that most scientists had the ability to read important foreign-language papers in their area. For undergraduate degrees the choice of language was mostly a strong suggestion, but for graduate degrees it was a serious requirement.

      I was a math major and I happen to read/speak Spanish. My Sp

      • In the late 1980s and early 1990s when I was in college most scientific degrees had a foreign language requirement

        For me, in the UK system, the foreign language requirement was applied at the university level - most universities applied the test when you were applying to them, before the point at which you applied to join a particular faculty let alone a department or a course.

    • Thank You!

      This is a good example of how (sadly) important salesmanship is to scientific acclaim. People fail to distinguish the facts (citation patterns observed) from the narrative brushed onto them.

    • THIS got into Nature? Were the peer reviewers sleeping?

      No offense, but I'll take those blind reviewers over your speculation based on a blurb about the paper and a search for "internet".

      The abstract says "We find that the observed declines are unlikely to be driven by changes in the quality of published science, citation practices or field-specific factors", so it looks like they attempted to control for such factors.

  • New technology enables new science. New science enables new technology.

    An example: X-ray crystallography, DNA, fast DNA sequencing, better understanding of proteins and evolution,...

  • ...have very likely declined, as the world has become more obsessed with exams, standardised testing, and standardised ability.

    You cannot produce breakthroughs unless you have non-standard thinkers with non-standard ability.

    Conformity won't get you great minds. Conformity will only give you brains suitable for production lines and menial work where creativity isn't required or useful.

    All progress depends on a well-educated non-conformist person.

  • We have a lot less corporate funding of research than we previously did. Business is all about increasing profit each quarter; it's about reducing the cost of the product and increasing prices on the buyer. Any investment that is not guaranteed to return a profit is eating into the quarterly profit game, and so it won't happen. We've completely lost sight of long-term investments.

    https://www.science.org/doi/10... [science.org]

  • No. They just got harder to understand for the layperson. Higgs Bosun for example. Just about anything quantum, for many more examples, never mind that these things change our lives on a daily basis.

    • That is interesting in that quantum mechanics/quantum effects more-or-less undermine the traditional precepts of science, that is, as a statistical/numerical process, you either abandon the concept that there is some underlying closed-form "truth", or you come up with a wild variety of "true" causes that all make the same predictions, and therefore cannot be distinguished from each other - loop quantum gravity, string theory, "this is a simulation and quantum effects are the min bit" etc.

      Ther

      • It only appears so because we can't make it work with gravity. The quantum theories of gravity such as string theory and loop quantum gravity are either impossible with current technology to test, or, in the case of LQG, thus far have not really borne fruit (space time doesn't appear to be made up of discrete units, but rather appears smooth any way we look at it). The basic concepts of QM have been around since Einstein's work on light, and in other applications, such as weak interactions that lead to radio

      • Nobody serious questions quantum mechanics these days, except out of ignorance. At the same time, nobody seriously questions that quantum mechanics is incomplete. It does not explain why particles have the masses they do, for example, or why there are exactly three generations of quarks and leptons.

    • With physics, other than advances in material research, we've also hit a bit of a holding pattern, because to pierce the veil further is going to require ever more powerful instruments, which generally means a lot more energy. There are hints of new physics to be found; we know the Standard Model isn't complete, we know that gravity still sits uncomfortably outside of modern physics, a classical holdover that thus far refuses to play nice with quantum mechanics. But the big advances are going to require a he

    • The bosun is in charge of the equipment, whether their name is Higgs or not.

      The Higgs Boson, though... Fiendishly complicated thing if you have a Newtonian mind.

      • Fiendishly weird rather than fiendishly complicated. The mechanism is simple actually, though the mathematics to describe it is nightmarish (and beyond me, for now).

  • Let me suggest that disruptive science is not simply disappearing, it is being suppressed. By definition, disruptive science is change, and for people whose careers are built around past science, disruptive science is not a good thing. Imagine if someone demonstrated that the twin studies that your entire career is based on were wrong. Most science is funded by people with a financial interest in the outcome. There is no money to be made from a cure for cancer if you are in the business of treating it. Imagine
  • All of the low-hanging fruit has already been picked. What is left is far, far more difficult, thus requiring longer effort by many more people. The era of the solitary genius (Einstein) being able to radically upend a vast field on his/her own has been over for decades.
  • Or to increase our professors' impact rating. I lifted half my thesis from Wikipedia and got paper references from there, and those papers' references.
    Don't care. Why the fuck did I have to do a thesis? I already knew how to code when I was a kid. HTML is not rocket science, neither is CSS, JS, SQL or PHP.

  • What the authors of the January paper measured was a changing pattern in the way papers were cited. They created an index of disruptiveness that measured how much a finding marked a break with the past. A more disruptive paper would be cited by many future papers while previous papers in the same area would be cited less — presumably because they were rendered obsolete. This pattern, they found, has been on a decades-long decline.

    MAYBE what they discovered was that some scientists do a better job of promoting their papers than others.

    It's really hard to see how such a change in citation patterns equates to a decline in "breakthroughs."
