Science

Nobel Prize in Physics Goes To Machine Learning Pioneers Hopfield and Hinton (nobelprize.org) 49

John J. Hopfield of Princeton University and Geoffrey E. Hinton of the University of Toronto were awarded the Nobel Prize in Physics on Tuesday for their groundbreaking work in machine learning. The Royal Swedish Academy of Sciences recognized the scientists for developing artificial neural networks capable of recognizing patterns in large data sets, laying the foundation for modern AI applications like facial recognition and language translation.

Hopfield, 91, created an associative memory system for storing and reconstructing data patterns. Hinton, 76, invented a method that can autonomously discover properties in data. "This year's physics laureates' breakthroughs stand on the foundations of physical science," the Nobel Committee stated. "They have shown a completely new way for us to use computers to tackle many of society's challenges." The laureates will share the 11 million Swedish kronor ($1.1 million) prize.
  • Physics? (Score:5, Insightful)

    by Artem S. Tashkinov ( 764309 ) on Tuesday October 08, 2024 @06:45AM (#64847859) Homepage
    It's strange to me that this invention has been categorized as "physics", when it sounds like it has much more to do with mathematics or computer science.
    • I was thinking the same thing. But they're trying to stay relevant and be hip. There have been other recent changes that only make sense because of social pressure and not scientific excellence.

      Though I could defend them a bit, in the sense that particle physics is stuck in a bit of a rut. String theory and loop quantum gravity have not gone anywhere. All the test results have been negative. Sabine Hossenfelder has been pointing this out for a while. So they're a bit desperate for something that is a genuine

      • by dskoll ( 99328 )

        The urban legend as to why there's no Nobel Prize for Mathematics is that Nobel's wife had an affair with a mathematician. However, seeing as Alfred Nobel was never married, this legend is extremely Urban...

      • Re: Physics? (Score:5, Interesting)

        by Rei ( 128717 ) on Tuesday October 08, 2024 @08:31AM (#64848123) Homepage

        That's not it at all. They've gone into great detail as to why it was awarded for physics [nobelprize.org].

        TL/DR: Hopfield is a physicist. Like, his thesis literally was "A quantum-mechanical theory of the contribution of excitons to the complex dielectric constant of crystals". When he switched to neural networks, he did so from a physics basis: his model was based on energy minimization landscapes. Hinton expanded on that by bringing in statistical physics, originating from the work of Boltzmann - he literally called his initial network a "Boltzmann Machine" [wikipedia.org]:

        A Boltzmann machine (also called a Sherrington–Kirkpatrick model with external field or a stochastic Ising model), named after Ludwig Boltzmann, is a stochastic spin-glass model with an external field, i.e., a Sherrington–Kirkpatrick model,[1] that is a stochastic Ising model. It is a statistical physics technique applied in the context of cognitive science.[2] It is also classified as a Markov random field.[3]

        Boltzmann machines are theoretically intriguing because of the locality and Hebbian nature of their training algorithm (being trained by Hebb's rule), and because of their parallelism and the resemblance of their dynamics to simple physical processes.

        Today, the field of physics is deeply dependent on the machine learning enabled by Hopfield and Hinton to sift through the reams of data in many fields to find meaningful signals - including, for example, finding the Higgs boson, running gravitational wave detectors, detecting exoplanets, and on and on. The quantities of data processed today in many fields, except in cases where signals can be trivially distinguished from noise by handwritten algorithms, are far too large to rely on human analysis alone.
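
        To make the energy-minimization picture from the TL/DR concrete, here is a minimal Python sketch of a Hopfield-style associative memory (an illustrative toy, not the laureates' actual formulation or code): patterns are stored with a Hebbian rule, and a corrupted input is recovered by repeatedly sliding downhill on the network's energy.

        # Minimal sketch of a Hopfield-style associative memory (illustrative only).
        import numpy as np

        def train_hopfield(patterns):
            """Hebbian storage: sum of outer products of the +1/-1 patterns, zero diagonal."""
            n = patterns.shape[1]
            W = np.zeros((n, n))
            for p in patterns:
                W += np.outer(p, p)
            np.fill_diagonal(W, 0)
            return W / len(patterns)

        def energy(W, s):
            """Hopfield energy E = -1/2 s^T W s; asynchronous updates never increase it."""
            return -0.5 * s @ W @ s

        def recall(W, s, sweeps=10):
            """Asynchronous updates: each unit aligns with its local field."""
            s = s.copy()
            for _ in range(sweeps):
                for i in np.random.permutation(len(s)):
                    s[i] = 1 if W[i] @ s >= 0 else -1
            return s

        # Store one 6-unit pattern, then recover it from a copy with one flipped bit.
        stored = np.array([[1, -1, 1, 1, -1, -1]])
        W = train_hopfield(stored)
        noisy = np.array([1, 1, 1, 1, -1, -1])
        result = recall(W, noisy)
        print(result, energy(W, result))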

        • by Rei ( 128717 )

          Or if the TL/DR itself was TL/DR:

          Physics gave birth to machine learning, and now heavily relies on it for further discoveries.

          • Machine learning had many parents. Neural network research in the 90s was a collaboration across many fields. At least when I was in grad school, the local neural net teams came from the physics, computer science, biology, neurology (via the medical school), electrical engineering, and cognitive science departments.

          • by GoTeam ( 5042081 )
            It's all a conspiracy perpetrated by "Big Nobel"
        • This is not physics. This is math. The idea may be borrowed from physics, but it is math. Physics is math, but math is not physics.

          Sabine also agrees:
          https://www.youtube.com/watch?... [youtube.com]

          It looks like since there is no Nobel for Math, the Nobel Committee had to hack the Nobel. They already did this with the "Nobel for Economics".

          • by SirSlud ( 67381 )

            Wow, the author of Lost in Math: How Beauty Leads Physics Astray has an opinion about whether this is math or physics? Amazing.

          • Computation is physics. If the inventors of the telescope could be awarded the prize, and I think they could, then any AI that helps physics research can similarly be awarded the prize. The point is that both are physical proxies for gaining information about the universe.

        • by kwerle ( 39371 )

          Thanks for the concise summary! (I don't have mod points or I'd have used 'em)

        • This. My guess is the secret at the end of the long AI road is that when the human brain is perfectly understood, the shocking truth will be that none of it is special in any way whatsoever. It's just following a set of dynamics that can and do arise all over the physical universe.

          The thing that strikes me in studying the attention mechanisms that make the LLMs work is that there is no satisfying explanation; there is this meta explanation - training the query, key and value matrices separately - it just wor
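
          For readers who haven't seen it spelled out, below is a toy Python sketch of the scaled dot-product attention being described, with separate query, key and value projection matrices (random weights and made-up dimensions, purely illustrative; real models add multiple heads, masking, and trained parameters).

          # Toy scaled dot-product attention with separate Q, K, V projections (illustrative).
          import numpy as np

          def attention(X, Wq, Wk, Wv):
              Q, K, V = X @ Wq, X @ Wk, X @ Wv               # project each token three ways
              scores = Q @ K.T / np.sqrt(K.shape[-1])        # similarity of queries to keys
              scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
              weights = np.exp(scores)
              weights /= weights.sum(axis=-1, keepdims=True) # softmax over the keys
              return weights @ V                             # weighted mix of value vectors

          rng = np.random.default_rng(0)
          X = rng.normal(size=(4, 8))                        # 4 tokens, 8-dimensional embeddings
          Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
          print(attention(X, Wq, Wk, Wv).shape)              # (4, 8)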

          • by Rei ( 128717 )

            One thing Hinton has talked about is that what we're building is not us, but a sort of alien intelligence - something that is very much an "intelligence", and has hallmarks of us, but also differences. For example, it's commonly talked about how our models take more data to learn than we do, as we "mull over" new info (though models kind of can do that by training on synthetic data as well, but this really should be done simultaneously with new data acquisition). Or how we can iterate over tasks (though so ca

        • Just because physics and artificial neural nets use some of the same math doesn't make them the same kind of field of intellectual pursuit.

          This prize award is a reflection of two things:

          1) Computer (or Computing, or Information) Science still can't get no respect in its own right. (Imitation is the sincerest form of flattery I guess.)

          2) Physics is a dumpster fire of unverifiable theories these days, due to having reached beyond the size and time bounds of feasible measurability.

          I feel that physics will have
        • Their argument is far from convincing, as shown by the obvious "whaaaaaat?" you get from physicists, mathematicians, computer scientists AND chemists.

          Firstly, neither Hopfield nor Hinton originated the architectural insights which lie at the foundation of neural networks. If the goal was to reward important foundational work, then the Nobel committee should have rewarded the inventors of the perceptron [wikipedia.org] while they were still alive.

          If the "physics connection" through ensembles was the goal, then there ar
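
          For reference, the perceptron linked above reduces to a very small learning rule. A rough Python sketch (illustrative toy data, not historical code):

          # Rough sketch of the classic perceptron learning rule: nudge the weights
          # toward every example the current weights misclassify (illustrative only).
          import numpy as np

          def train_perceptron(X, y, epochs=20, lr=0.1):
              w, b = np.zeros(X.shape[1]), 0.0
              for _ in range(epochs):
                  for xi, yi in zip(X, y):          # labels yi are +1 or -1
                      if yi * (xi @ w + b) <= 0:    # misclassified or on the boundary
                          w += lr * yi * xi
                          b += lr * yi
              return w, b

          # Example: learn logical AND with +1/-1 labels.
          X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
          y = np.array([-1, -1, -1, 1])
          w, b = train_perceptron(X, y)
          print(np.sign(X @ w + b))                 # [-1. -1. -1.  1.]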

      • Hinton was awarded the Turing Award in 2018, which is "The Nobel Prize of Computer Science".

    • by Tx ( 96709 )

      The Nobel Prize categories are physics, chemistry, physiology or medicine, literature and peace, as laid out by Alfred Nobel himself, plus the economics prize added in his memory. Short of going against Alfred Nobel's will and adding new categories, it seems sensible of the Nobel Committee to be somewhat flexible when there are eminently deserving nominees whose work doesn't quite fit into any of the existing ones. Hopfield and Hinton's work may not be traditional physics, but it is rooted in statistical ph

      • by JBMcB ( 73720 )

        Short of going against Alfred Nobel's will and adding new categories, it seems sensible of the Nobel Committee to be somewhat flexible when there are eminently deserving nominees whose work doesn't quite fit into any of the existing ones.

        Nobel wanted to give an award specifically for physics, so torturing the definition of physics to give awards to those in other fields isn't going against his wishes? There are already well-respected awards for computer science and math; why does the Nobel committee need to fold those categories into physics as well?

        • by neoRUR ( 674398 )

          If it wasn't for physics, Hinton and Hopfield would not have gone from physics to neural networks, and they used lots of physics ideas in their work.

    • Ellen Moons, a member of the Nobel committee at the Royal Swedish Academy of Sciences, said the two laureates "used fundamental concepts from statistical physics to design artificial neural networks that function as associative memories and find patterns in large data sets."

      from https://phys.org/news/2024-10-... [phys.org]

      An interesting article regardless of what you think.

    • by Okian Warrior ( 537106 ) on Tuesday October 08, 2024 @09:10AM (#64848227) Homepage Journal

      It's strange to me that this invention has been categorized as "physics", when it sounds like it has much more to do with mathematics or computer science.

      I'm of the opinion that AI is a subset of physics and not mathematics.

      AI is fundamentally finding patterns within noisy measurements and learning to make predictions.

      The relevant difference here is that AI is based on *measurements*, while mathematics is pure and not tied to anything physical. One could look at non-Euclidean geometry as a good example: space could be curved like a sphere, curved like a saddle, or not curved at all (flat). All three lead to interesting results, but only one will be correct in our actual universe.

      Real world measurements have to deal with noise and other measurement artifacts, and as a researcher you need to keep that clearly in mind. As an example, Neuralink data is scaled so that it comes out with maximum dynamic range within the WAV format. All well and good, except that WAV samples are integers, and if you multiply your sample measurements by, say, 10.3 to get maximum range, you introduce quantization noise in your data that didn't originally exist: the results have to be rounded up or down when converted to integers.

      In pure math this isn't a problem: you can multiply your inputs by anything, because you're working with real numbers.
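
      A tiny illustration of that rounding artifact (made-up sample values, not actual Neuralink data):

      # Scaling samples by a non-integer gain and storing them as integers (as a WAV
      # file does) introduces quantization error that pure real-number math never sees.
      # The values here are made up purely for illustration.
      import numpy as np

      samples = np.array([3.0, 7.0, 12.0, 20.0])       # hypothetical raw measurements
      scale = 10.3                                      # gain chosen to fill the dynamic range
      stored = np.round(samples * scale).astype(int)    # what lands in the integer WAV samples
      recovered = stored / scale                        # best attempt to undo the scaling
      print(stored)                                     # [ 31  72 124 206]
      print(recovered - samples)                        # residual quantization error, not all zeros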

      ChatGPT has to deal with misspellings and bad grammer, and still figure out what the user actually meant. Such as you just did with my previous sentance, when I misspelled "grammar". Or "sentence".

      There's some actual math involved in AI as well, but there's actual math in chemistry and biology too. The math is fairly simple in current models, although the results are generally hard to predict, and the research seems to be more experimental than deductive. People set up new situations and "try it" to see whether it works, whether it works better than current solutions, and so on. There are metrics and standard benchmarks to test on (recognition accuracy of postal zip codes, for example).

      For example, the learning rate (scale factor for back-propagated errors) for ANNs has to be "tuned" to a proper value to avoid well-known problems. That's more of a "we do this because it works" thing rather than a "we can predict this is the correct value to use" thing.
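
      That sensitivity shows up even in a one-parameter toy: the same plain gradient-descent loop converges or blows up depending only on the learning rate (arbitrary numbers, purely illustrative):

      # Toy illustration of learning-rate tuning: minimizing f(x) = x^2 with plain
      # gradient descent. A small step converges; too large a step overshoots and diverges.
      def gradient_descent(lr, steps=50, x=1.0):
          for _ in range(steps):
              x -= lr * 2 * x          # gradient of x^2 is 2x
          return x

      print(gradient_descent(lr=0.1))  # ~1e-5, converges toward the minimum at 0
      print(gradient_descent(lr=1.1))  # ~9e3 in magnitude, each step overshoots further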

      Because AI inherently comes from measurements of the physical world, and because of its experimental nature, I'm of the opinion that AI should be considered physics and not math.

      • by GoRK ( 10018 )

        AI is more like statistical mechanics which is ... well let's just say it started as very clearly not physics and the deeper we dig the more it seems that it is basically the entirety of physics.

    • by BigFire ( 13822 )

      Nobel hated mathematicians, which is why there's a Fields Medal just for mathematicians.

    • by guruevi ( 827432 )

      When the prize was invented, there was no computer science to speak of as a branch. Physics is applied mathematics, and the Nobel prize is intended for practical inventions and progress.

    • Yes, this confused me too. A Turing Award, sure, as his work is now standard computer science and electrical engineering. This is essentially simulated biology using math. It's great work deserving of recognition, but... physics?

      Though I can see some who claim that physics is merely applied mathematics, and everything else in science is merely applied physics...

  • Sooner or later we will have to give them credit.
  • by itamblyn ( 867415 ) on Tuesday October 08, 2024 @07:13AM (#64847909) Homepage
    Given the impact that neural networks have had on physics research in the past few years, I can see the logic behind this.
  • by v1 ( 525388 ) on Tuesday October 08, 2024 @07:18AM (#64847917) Homepage Journal

    hah! that's the first network we studied in AI class waaaay back in the 90's. "Does it think or does it stink?" That was our mantra while developing.

    Training back then took absolute ages though and didn't produce results that were especially useful. Back then, computer IO and peripherals were still pretty limited. OCR was about the most useful application at the time. Certainly nothing with audio.

    Of course one of the bigger challenges with developing any neural network is to come up with an efficient, effective way to introduce data into the input layer. We just didn't have the computer power and memory needed for the broad input layers we can use today.

    But that's way in my past now, I'll gladly let the new generation of geeks take on the new challenges. I assume they've found better structures than the Hopfield Net to process broad input sets!

  • 91 years old? And in a field that's sort of bleeding edge?

    • AI is not bleeding edge at all.
      The vast majority of foundational concepts in AI - and in ML, which used to be not exactly the same thing - are 50 years old.
      They simply lacked the raw computing power to apply those ideas in meaningful ways.

      PS: Not to say that nothing has happened in 50 years, of course, but the field is way older than people usually recognise.

    • by gweihir ( 88907 )

      AI is not "bleeding edge". Most of what is used today I learned 35 years ago when studying CS. Sure, there have been performance and size improvements and the actual language pre- and post-processing (which is not AI) is impressive. But the actual capabilities of the current models are nothing special. Even "hallucinations" are an old observation.

      • Ya I forgot that and stand corrected.
        I remember reading about the tic-tac-toe "self learning" algorithm that used matchboxes (not computers), originally from the early 1960s it seems.
        Still, at 90 I would probably not even be able to hold my phone, let alone do any Nobel-prize-level AI shit.
        Who am I kidding, I'll be dead at 60 with my current lifestyle

  • Probably what happens to an ageing physicist: many of them start to hallucinate "deep truths". That doesn't diminish his earlier accomplishments, though.

  • Should be awarded to the LLMs, they are so creative in anything written
  • Maybe five were for inventing gizmos (gas valve, fiber optics, CCD, various kinds of microscopes) and the rest were for discoveries about the structure of matter and energy. Even the gizmos were kind of about harnessing new discoveries about the structure of matter and energy, or for better probing the structure of matter and energy.

    AI isn't about discovery and testing that insight against reality. It's about regurgitation and hallucination. The opposite of physics. These guys do not belong on the same lis

  • Thank God I didn't major in Physics. I was close to doing it, I think I had watched a few episodes of Cosmos and thus fancied myself getting into astrophysics. Luckily someone told me "astrophysics" had no jobs, and I could do a lot as a hobbyist instead while I made money with something else (well that was BS). Anyway, back to my point .. Physics is dead, because the Standard Model solved everything, there's nothing else to award. Everything left is wild untestable speculation (String Theory) or chemistry
