Science

1999 Nobel Science Prizes Announced

Andrew Childs writes "The 1999 Nobel Prizes in the sciences have been announced. The physics prize goes to Gerardus 't Hooft and Martinus Veltman "for elucidating the quantum structure of electroweak interactions in physics." The chemistry prize goes to Ahmed Zewail "for his studies of the transition states of chemical reactions using femtosecond spectroscopy." And the prize in physiology or medicine goes to Gunter Blobel "for the discovery that proteins have intrinsic signals that govern their transport and localization in the cell." "
  • by Anonymous Coward

    "Hope the legislators are embarrassed..."



    They would be, if they knew their shame. Not very
    familiar with Oklahoma rednecks, are you? They
    don't KNOW their level of ignorance, and have no
    sense of irony. And I doubt you've written them
    (as if they can read) to let them know there's something they should be ashamed of.

  • by Anonymous Coward
    Everyone on /. seems to be suggesting there should be a computing prize. I'm going to have to go against that. Computer science is really just an application of mathematics. There isn't a prize for math, so one for computer science doesn't make sense. In addition, I think people on /. would say that Larry Wall (the Perl author) or Linus should get an award. This would be like saying my electric company should get the award in physics because they built a nuclear reactor (i.e., something that's been done with really no new ideas). If there were a computer science award, I would guess it would go to people who designed new algorithms, such as RSA, efficient heaping techniques, or whatever. These algorithms are mostly mathematically based, so it wouldn't make sense to have the award without a math category.
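The RSA mentioned above is a good illustration of how mathematical the field's landmark algorithms are: the whole scheme is just modular exponentiation plus the difficulty of factoring. A toy sketch with textbook-sized primes (real keys use primes hundreds of digits long, and this omits padding and every other practical safeguard):

```python
# Toy RSA keypair from tiny primes -- illustrative only, not secure.
p, q = 61, 53
n = p * q                  # public modulus (3233)
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent, chosen coprime to phi
d = pow(e, -1, phi)        # private exponent: (e * d) % phi == 1

message = 42
cipher = pow(message, e, n)     # encrypt: message^e mod n
recovered = pow(cipher, d, n)   # decrypt: cipher^d mod n
assert recovered == message
```

(`pow(e, -1, phi)` computes the modular inverse; that form needs Python 3.8+.)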
  • by Anonymous Coward
    Well,... Alfred Nobel set up the prizes for the fields which he deemed important. If there should be a computing prize, then a mathematics prize should precede it. How many of the things done these days can be accomplished without mathematics? Computing is based fundamentally on mathematics.

    On another note, one should keep in mind that when the Nobel prizes were first established, the work they honored applied on a much larger scale. For instance, prizes were given for the study of the atom. With a better understanding of the atom, the fields of chemistry and biology were advanced. With advancements in biology, medicine progressed. Zewail's research could potentially allow for much progress to be made in biology.

    One should remember the following when dealing with the sciences... "biology becomes chemistry, chemistry becomes physics, and physics becomes mathematics." Any college student can tell you that...
  • by Wakko Warner ( 324 ) on Tuesday October 12, 1999 @04:53PM (#1619414) Homepage Journal
    It's already been done... Just open any history
    book to a pre-1970's section and start reading.


    It used to suck.


    Did it? Sure, there were other things to worry about then, but I maintain that the human condition was in a much better state back then than it is now. Computers only seem to get in the way most of the time. Sure, you can make a small exception in the field of scientific research, but think about all the other things we've lost due to the advent of the personal computer.


    People are less productive at work. (Studies have proven this again and again.)


    The government finds it much easier to maintain huge databases of people and their habits. It's a simple matter nowadays to find out any information you want about anyone -- something that was out of reach for anyone in the past who couldn't afford a private investigator is now as simple as entering your credit card number and waiting for a tidy report of your neighbors' dirty laundry to arrive in your electronic mailbox. We have less privacy than before.


    Think of how much time you waste wrestling with your computer every day, every week, every month. Add it all up. Some day, you'll want that time back.


    So science has grown us a healthier tomato plant and given us the ability to maintain erections well into our sixties, and computers are a part of that. Small contribution compared to the rather enormous chunks of life and liberty they've usurped from us.


    Has your quality of life really improved?


    - A.P.
    --


    "One World, one Web, one Program" - Microsoft promotional ad

  • The name of the man is Gödel, or if you are umlaut-challenged, Goedel, and his incompleteness result implies nothing of what you say. What it says is that for any formal system that is at least powerful enough to describe elementary number theory, there exist sentences ("theorems") that are true, but not provably true.

    Not only did you get your assumptions wrong (I can easily devise formal systems that are complete and yet quite useful), the whole assertion makes little sense in this context.

  • The Ig Nobels are the prizes given by the Annals of Improbable Research for research that "cannot or should not be reproduced." You can't talk about the Nobels without talking about the Igs. The link is here [improbable.com]
  • Note that the featured link to the Nobel prize committee's announcement site doesn't just announce the winners but also offers some nice explanations of the work done!

    Also interesting is that today we can click through to the home pages of the prize winners. Again, look at the announcement page. Gerardus 't Hooft has the most interesting home page of them all.

  • As good as Linux is, and as good as Linus is for the computing community as a whole, I don't quite think he deserves a Nobel Prize, yet... ;)

    Do they even have a Nobel Prize for *anything* involving computers?

  • by JoeBuck ( 7947 ) on Tuesday October 12, 1999 @05:39PM (#1619419) Homepage

    The closest thing to a Nobel in computer science is the Turing Award [acm.org], given by the ACM.

    For mathematics, the closest equivalent is the Fields medal [elib.zib.de].

  • Sadly, I think they care more about holier-than-thou posturing and community standards than Nobel Prizes.

  • The history of science is strewn with theories that come up with plausible mathematical models that turn out to be wrong experimentally. You might be able to derive chemistry from physics, but you cannot derive physics from mathematics. Math is a useful tool, but it is only a tool.
  • by Axe ( 11122 )
    ...so there will be no 5th annual Nobel Prize party at the Stanford Physics Dept. Sucks. The free food was good the last 4 years... :)
  • Yup, but if you don't have an umlaut on your keyboard the closest spelling is Guenter.

    /peter
  • I love computers as much as the best nerd, but there's no way they're worthy of a Nobel. For one thing, pretty much everything in computing is profit-oriented except for OSS, and for another, it doesn't really improve the human condition or anything.
  • There's also a quasi-Nobel in biology, which Seymour Benzer has won, among others.
  • If the Nobels were awarded solely based on the quality of the work done by these individuals, then perhaps this math trick did not deserve to win. However, there is more to the story.

    The high-energy/particle physics community in Europe is currently focusing on using CERN to find the Higgs boson (a.k.a. the "God Particle"). But, to do that (or at least to beat the Americans to it), they need more funding. Two European high-energy/particle physicists winning the Nobel is just the thing to get CERN more money.
    ------
  • ...I honestly think you're over-estimating the knowledge of the general public.
  • by egnor ( 14038 ) on Tuesday October 12, 1999 @02:46PM (#1619428) Homepage
    The official site has a transcription of the section of Nobel's will [nobel.se] by which Nobel established the original prizes (physics, chemistry, physiology or medicine, literature, and peace). He asks that the prizes be awarded to those who have "conferred the greatest benefit on mankind", but for whatever reason he chose to leave out mathematics, much of biology (though you can interpret "physiology" generously), and most kinds of engineering, among others.

    I'm sure biographers have had a wonderful time guessing what influences in his life led him to favor those particular five fields.

    In any case, Nobel himself specified it that way; you can't just add another prize for your favorite field. At best, you could try to establish another "memorial" prize, like the one for economics. This is probably good: if you could, everyone would be agitating for their favorite hero to get the coveted Nobel prize. And if they succeeded, then the prize wouldn't exactly be coveted any more...

  • Proteins interact with each other based on their shapes, and the attracting or repelling forces of their constituent parts. When they bind to each other, it works much like a lock and key (when they briefly mesh to facilitate some reaction, then break apart) or interlocking puzzle pieces (when they bind more permanently). Much more of a tactile than an optical event.

    As a theoretical chemist, I must object. The interaction of two molecules is a consequence of electrostatics and the quantum mechanical properties of the electron (exchange, correlation, etc.).

    To call these complex interactions "tactile" (while infinitely better than calling them optical) is just plain wrong.

    As far as the hardware vs. software issue goes, I don't think you can apply those paradigms here. The proteins a piece of DNA codes for are a function of structure, and the actual enzyme is far more complex than the protein encoded. For example, once assembled, a protein may go through several chemical changes before it becomes "activated" (e.g., picking up metal ions or other chemical moieties). So one could say that the protein has a program too.

    Since one can call anything in a cell hardware or software to one degree or another, I don't think these analogies help to elucidate anything biological or physical in this case.

  • Uhh, it's a little more soundly based than that!

    Renormalization is a very powerful technique that applies across a broad swath of physics, not just with elementary particles. The supposed "math error" is a recognition that the mathematical theories are incomplete. For example, in QED a calculation of the energy of an electron as a point charged particle gives you an infinite number - that can't be right of course because the electron does not have infinite energy or mass. An examination of where the infinity comes from reveals that it involves stuff at very small distances (and very high particle energies) - the renormalization technique is then to remove the infinity by putting in a high energy or short-distance cutoff in some physically consistent manner. Renormalization is really an augmentation of a mathematically "pretty" theory with some rules for getting practical calculations out of it - and the results are indeed very practical - QED has been verified in some instances to 1 part in 10^12 or so, better than any other basic physical theory we have.

    The totally correct theory of course would not treat the electron (or photon) as a point particle but as some extended object (e.g., a string). String theories in fact do resolve all these weird infinities, which is why they are so popular. Unfortunately, it's impossible to calculate just about anything practical with string theory (yet).
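The cutoff trick described above can be mimicked in a toy numerical experiment (my own illustration, not the actual QED calculation): put in a cutoff so each "bare" quantity is finite, subtract one divergent piece from another, and the difference settles to a cutoff-independent value.

```python
import math

def bare_sum(cutoff):
    """'Bare' log-divergent quantity: sum_{n=1}^{cutoff} 1/n."""
    return sum(1.0 / n for n in range(1, cutoff + 1))

# The bare sum grows without bound as the cutoff is removed, but
# subtracting an equally divergent piece (here the log of the cutoff)
# leaves a finite, cutoff-independent answer: the Euler-Mascheroni
# constant, ~0.577216.
for cutoff in (10**2, 10**4, 10**6):
    s = bare_sum(cutoff)
    print(cutoff, round(s, 4), round(s - math.log(cutoff), 6))

regulated = bare_sum(10**6) - math.log(10**6)
```

The physical content is the same in spirit: the divergence lives entirely in the cutoff-dependent piece, and what is left over is what you compare with experiment.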
  • Proteins have always struck me (who's never taken an organic chem class) as programs, hardware and software in one...

    I think they're more hardware than software. Some proteins are enzymes, functioning in the regulation of vital chemical processes, others are structural, and that, to me, makes them more analogous to hardware.

    Hope this helps.

    If they must be software, I think proteins would be object code, while the genetic material (DNA for most of us) would be analogous to source code. Maybe all those introns (DNA segments that are replicated and transcribed but never expressed as protein) are comments, formatting, etc.!

    Then again, maybe it's all object code, with introns as the logic of the program, while exons are the data. By this analogy, the exons are the DNA code that gets transcribed (expressed) into proteins, while the introns dictate under what conditions to transcribe them. Loops and comparisons and subroutines, oh my!

    ...possibly optical in nature...

    I'd guess tactile, rather than optical. For one, unless one is working at extremely high (and thus damaging/dangerous to fragile DNA) frequencies, electromagnetic radiation (light) is inadequate to resolve the minute details of proteins. They're just too small. That's why electron microscopes and x-ray crystallography and the like are required instead of just visible-light microscopes.

    Proteins interact with each other based on their shapes, and the attracting or repelling forces of their constituent parts. When they bind to each other, it works much like a lock and key (when they briefly mesh to facilitate some reaction, then break apart) or interlocking puzzle pieces (when they bind more permanently). Much more of a tactile than an optical event.
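The wavelength argument above is easy to check on the back of an envelope: a photon's energy is E = hc/λ, so photons short enough to resolve nanometer-scale protein features carry keV-scale, ionizing energies. A quick sketch using the standard physical constants:

```python
# Back-of-the-envelope photon energies, E = h*c / wavelength.
H = 6.626e-34        # Planck constant, J*s
C = 2.998e8          # speed of light, m/s
J_PER_EV = 1.602e-19 # joules per electron-volt

def photon_energy_ev(wavelength_m):
    """Energy in eV of a photon with the given wavelength in meters."""
    return H * C / wavelength_m / J_PER_EV

visible_ev = photon_energy_ev(500e-9)  # green light, ~500 nm
xray_ev = photon_energy_ev(1e-9)       # ~1 nm, protein-scale detail
# ~2.5 eV for visible light vs. ~1240 eV for wavelengths short enough
# to resolve protein structure -- well into the ionizing range, hence
# the radiation-damage worry raised above.
```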

  • "I challenge you to describe the world without computing."

    It's already been done... Just open any history
    book to a pre-1970's section and start reading.

    It used to suck.

    -WW
  • "Computers only seem to get in the way most of the time."

    For you maybe, not for me.

    "think about all the other things we've lost due to the advent of the personal computer. "

    Now compare that to all that we've gained, and
    you'll see a lopsided victory in favor of
    computers. It's not even close.

    Perhaps you're one of those that just would like
    to get back to the "good old days"? I've heard
    this scenario before... the grass is always greener
    and all that.

    "People are less productive at work."

    You left out how computers have created entirely
    new fields of work, while utterly replacing the
    need for humans in others. It's give and take.

    Sure, right now computers are a distraction in
    SOME fields where they are not being used
    correctly, or are overkill for the job at hand,
    or the person is not responsible enough to handle
    having a computer, or....... you get the point.
    It's not the computer's fault that some people
    mis-use it. I bet the same could be said for many
    new inventions at the workplace.

    "something that was out of reach for anyone in the past who couldn't afford a private inspector is now as simple as entering your credit card and waiting for a tidy report of your neighbors' dirty laundry to arrive in your electronic mailbox."

    So you feel that only allowing the rich access to
    information is a good thing? Funny, I thought the
    notion of computers leveling the playing field was
    a good thing!

    "We have less privacy than before."

    So what? Privacy is traded all the time for other
    things. It's not "Here, take our privacy, and give
    us nothing in return." There's a REASON you're
    losing privacy.

    And if you really want to, you can take it back.
    Trade in your PC's, your radios, your CD players,
    your microwave, your gameboy and N64, your TV,
    and yes, even your car. Trade them all in, for
    they all use computers, and go move out into the
    wilderness, or some remote, foreign location.

    Now you have your privacy... are YOU happy with that?
    I wouldn't be.

    "Small contribution compared to the rather
    enormous chunks of life and liberty they've
    usurped from us."

    You're living in another reality. You should
    write a book about it and stick it in the sci-fi
    section. You might even make enough money to
    afford that privacy you're after.

    "Has your quality of life really improved?"

    Are you reading this response in the comfort of
    your own home or back yard, while sipping a fine
    wine, smoking a cigar, lying on your back with the
    blue skies above, and your kids playing in the pool?

    No, but you could be, because of a little device
    known as a FUCKING COMPUTER.

    So, I guess my answer would be, "Yes."

    -WW


  • "All right... You all have spent so much time in front of an 18" screen, you can't see the big picture. 1) The term "Dark Ages" ...."

    And you have spent so much time studying history
    that you can't see my .sig is A JOKE. I stole it
    off a bumper sticker. Yes, we know correlation
    does not equal causation.

    Here's another one for ya. Maybe you can tell me
    the history behind it:

    "Jesus loves you.
    Everyone else thinks you're an asshole."

    -WW
  • "um uh last time i checked my microwave didnt have a computer"

    I didn't say it had a PC, I said it had a
    computer. A programmable electronic device that
    performs prescribed operations on data (and
    input/feedback) at high speed.

    "by the way any car with a computer in it sucks!!!!"

    Well, that would be any car made in the last
    10-15 years or so... They're all controlled by
    an on-board computer.

    "radio no need for a computer there radios have only gotten smaller not better with transistor and circut technology, the marconi wireless could cross the atlantic and thats about as far as one
    needs."

    I should have qualified: modern, electronic radios. Of course radio waves themselves only need
    power and amplification to be sent...

    "whats the quality of life for those who have no
    jobs or one which offers a paper hat instead of
    self respect because they lost thier jobs to a
    computer."

    I'm not trying to answer whether computers have been
    good to EVERYONE ON EARTH. There are always people
    who come out on the bad side of everything. That
    does not mean computers haven't changed things for
    the good.

    Or do you miss the days before we had factories
    that can create products faster than men by
    themselves?

    " in short if youre going to make an arguement
    know what you are talking about and consider all
    sides of the issue before spouting off thank you"

    I have considered all sides of the argument. You,
    however, are ignorant on many points here.

    -WW
  • Cplus [slashdot.org] wrote
    I think we should have a /. prize to reward the person with the highest karma rating.

    Highest? Not a good idea, as it benefits the very early subscribers, who've had longer to post. Instead I suggest something like a signal-to-noise ratio: the number of posts per week that score above a baseline. Then perhaps sort by volume. This way you would reward people who consistently get 4-5 with a few pithy, to-the-point comments, but without continuously responding just to get their posts noticed and their karma up. For your information, in studies of organisational efficiency, people have always found simple metrics to be rather self-defeating, as people orient to the artificial goal instead of the organisational objectives. Defining metrics for measuring people is an inherently politicised process. E.g., would you say that having a high IPO is a good measure of success in Silicon Valley? If so, does this encourage people to overhype technology, giving the entire industry a bad name in the long term?

    Oh well, he (or she) who defines the rules, gets the reward.

    LL
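For what it's worth, the metric LL proposes could be sketched as something like the following; the baseline score and the (user → posts) data shape are my own invented stand-ins, not anything Slashdot actually implements.

```python
# Hypothetical sketch of the suggested "signal-to-noise" karma prize.
BASELINE = 3   # posts scoring above this count as "signal" (assumed)

def signal_ratio(posts):
    """Fraction of a user's posts that score above the baseline."""
    if not posts:
        return 0.0
    good = sum(1 for _week, score in posts if score > BASELINE)
    return good / len(posts)

users = {
    "pithy":   [(1, 5), (2, 4), (3, 5)],     # few posts, high scores
    "spammer": [(1, 1)] * 20 + [(2, 5)],     # many posts, mostly noise
}
# Rank by ratio, then by volume to break ties, as the comment suggests.
ranked = sorted(users,
                key=lambda u: (signal_ratio(users[u]), len(users[u])),
                reverse=True)
assert ranked[0] == "pithy"
```

Note how the ratio rewards the occasional high-scoring poster over the prolific one, which is exactly the behavior the metric is meant to encourage.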
  • Chalst [slashdot.org] wrote
    No mathematics prize either, and I believe that subject existed then...

    The mathematical equivalent is the Fields Medal [cs.unb.ca]. The story is that Nobel had a bit of a ... umm ... personal disagreement with a prominent mathematician of the day, so he deliberately left out mathematics to prevent his rival from gaining any kudos.

    As for other posters wanting something similar for computing, I would instead suggest that a computer language which is widely adopted and solves a significant class of problems would be a better choice. After all, what is a language but a systematic way of enunciating the concepts for a general problem domain? In this way, the greatest mark of respect for Perl and (to some extent) Python has been their rapid adoption by peer programmers. To paraphrase ESR, show them the code and reap the kudos.

    LL
  • 't Hooft has a little FAQ on 'Can Theoretical Physics explain paranormal phenomena?' Find it at http://www.phys.uu.nl/~thooft/para.html [phys.uu.nl]. Also, check out his PostScript [phys.uu.nl] pictures; he even has one of a 'living black hole'!

    I had two classes with this guy, so I am excited to finally see him get what he deserved a long time ago.

    patrick.
  • I believe the spelling is Guenter Blobel.
  • No it wasn't.

    Some overzealous law enforcement official nabbed a guy, claiming the film was child porn. The case was thrown out of court. Plus, the guy got damages for being run through the system. That's a long way from Oklahoma banning the film.

    Granted, still pretty embarrassing...
  • A + infinity = B + another infinity

    Therefore A = B.


    Yeah, but what about the fact that "infinity" might be "bigger" or "smaller" than the "other infinity" :-)
  • There's a nice anecdote on why there's no Nobel award for mathematicians:
    the legend says that Nobel's wife/SO betrayed him with a mathematician...

    The reason why there's no Nobel award for CS is more obvious... when the Nobel fund opened there was no CS.

  • If he were still alive, I'm sure Feynman would actually have made quite an interesting TV award recipient! :-) OTOH he'd probably have skipped the ceremony and been reinventing physics at some topless bar, with 't Hooft and Veltman's work having been some unpublished margin notes next to his Feynman diagrams!
  • Yes, I've read it. :-)

    There's a new Feynman book just out, based on transcripts of his own words: The Pleasure of Finding Things Out [amazon.com].

    As you say, one of the most fascinating, and also most brilliant, minds of this century. If I were selecting a "great figures of history" dinner party, Feynman would definitely be at the head of the table!


  • Heck, he could have done a musical number on the bongos....heh.

    If you haven't read it, take a look at Genius by James Gleick. An excellent supplement to Feynman's own writings. He's truly one of the most fascinating folks of the 20th century.
  • The Slashdot Prize should be your nickname added in a derogatory fashion to the bottom of every poll. Heh.

  • It's nice to see a Nobel prize awarded for something we all learned NOT to do in elementary school! 't Hooft used a mathematical trick called "renormalization" to allow the Standard Model -- QED (quantum electrodynamics, i.e., electrons) plus QCD (quantum chromodynamics, i.e., quarks) -- to produce the "correct" results instead of nonsense (infinite) answers. Unfortunately, renormalization consists of a somewhat more sophisticated version of the following equations:

    A/0 = B/0 {yes, that's dividing both sides by zero}

    Therefore A = B.

    Needless to say, this technique can, properly applied, solve ALL problems. Physicists are exceedingly unhappy about having to renormalize QED/QCD (which generated its own set of Nobels for people like Feynman, Weinberg, Glashow, and Salam, among others), and it's widely felt this means the Standard Model is not the last word, even though it gives exquisitely accurate predictions that have subsequently been borne out in the real world. Unfortunately, trying to apply the same techniques to a quantum theory of gravitation leads to infinities that cannot be renormalized (so far, anyway), so hopes of a TOE (theory of everything) are still nebulous. Nevertheless, a whole bunch of theoretical physicists are devoting their lives to that holy grail and its Nobel.

  • funny, simple, and untrue.

    -
    /. is like a steer's horns, a point here, a point there and a lot of bull in between.
  • The story I heard was that Nobel's wife was involved in an affair with a young mathematician who would have stood a good chance of winning the prize for mathematics, had there been one...

    That's a myth. Entirely untrue, but for some reason it comes up EVERY time this subject is discussed.



    -
    /. is like a steer's horns, a point here, a point there and a lot of bull in between.
  • Hmm. Personally, I think it should go to someone dealing with the science of networking, which has allowed the Net to keep pace with geometric increases in bandwidth, users, nodes, and loads. As opposed to those who make atomic-size miniature pianos.

    But I doubt it will happen. Some of those networking advances use some pretty hefty math.

  • However, mathematicians can, and frequently do, win the Nobel prizes for economics and physics. The idea is that math itself does not "confer [a great] benefit on mankind", but the things you do with it might. Presumably computer science would work the same way. So, there's no reason why a computer scientist couldn't win the prize in practically any of the fields, although literature, for one, seems a bit unlikely. (Start writing those Perl poems now!)


    -r

  • There is a "funny" and simple explanation why there is no math Nobel prize: the wife of Nobel had an affair with a mathematician...
  • If anyone's interested, Caltech [caltech.edu], the home of Professor Zewail, the chem winner, has a press release up as well as video from a press conference from earlier today.


    Of course, I'll have to insert a Go Caltech! here (OK, so I hadn't even heard of the place when he did his research in the late 80's... I'm here now.)
  • You cannot build a computer without using physics! Enough said!!!
  • No mathematics prize either, and I believe that subject existed then...
  • The film adaptation of his most famous novel `The Tin Drum' was outlawed in Oklahoma until last year as child pornography.

    Hope the legislators are embarrassed...
  • You'd miss... it's an awfully small target..
  • The reason why there's no Nobel award for CS is more obvious... when the Nobel fund opened there was no CS.

    So what? Nobel didn't establish the Economics prize either; it was started in 1968 to celebrate the 300th anniversary of the Swedish central bank. If there were enough call for a CS prize, they'd find a way to create one.
  • by Lucius Lucanius ( 61758 ) on Tuesday October 12, 1999 @06:20PM (#1619460)
    To all the people asking why there is no Nobel in math or comp. sci.: there have been many people asking the same question about mechanical engineering, civil engineering, aeronautics, etc., for years. Keep in mind that the Nobel prizes were established about a century ago; these fields were very valuable and saved millions of lives many decades ago, and yet did not merit a Nobel.

    The answer is in the intent and purpose of the prize. Also, the Nobel in economics arrived recently, in the 1960s, and is titled "The Sveriges Riksbank (Bank of Sweden) Prize in Economic Sciences in Memory of Alfred Nobel." It is in many ways different from the "original prizes".

    Check out the history for the answers, and as to why there is no Nobel in mathematics.

    http://www.lib.lsu.edu/sci/chem/guides/srs118_history.html

    L.


  • Well, I don't know about that. There are very few that I think would deserve a "Nobel Prize in Computing". Alan Turing. Claude Shannon. Uh....

    Most that happens in this field isn't Nobel calibre stuff. We are in an evolutionary, not revolutionary field.

    Anyway, certainly not Linus: Linux, though a revolution organizationally, is not particularly revolutionary technically. It is a very good OS, but at its root, it is merely a clone of something else. You don't get Nobels for copying someone else's work.

    I'm not trying to diss anyone here. Linux is a great thing, but what is great about it is the method with which it was created, not the OS itself.
  • Actually the 'whole point' of the Nobel prize was
    so that Nobel could be sure that when he died
    he would be known as more than the dude that invented dynamite.

    Just providing your daily dose of cynicism.
  • From the page -- Günter Blobel.

  • Although Linus has obviously contributed immensely to the world of PCs, the distribution that has been able to overcome the "nerdy" stereotype of Linux operating systems is Mandrake. In my opinion, Mandrake will be more influential in the surge of domestic Linux users in the world, due to the fact that average users can now install and run what was once a nightmare.
  • PHYSICS NEWS UPDATE
    The American Institute of Physics Bulletin of Physics News
    Number 452 October 12, 1999 by Phillip F. Schewe and Ben Stein

    THE 1999 NOBEL PRIZE FOR PHYSICS goes to Gerardus 't
    Hooft of the University of Utrecht and Martinus Veltman, formerly
    of the University of Michigan and now retired, for their work
    toward deriving a unified framework for all the physical forces.
    Their efforts, part of a tradition going back to the nineteenth
    century, center on the search for underlying similarities or
    symmetries among disparate phenomena, and the formulation of
    these relations in a complex but elegant mathematical language. A
    past example would be James Clerk Maxwell's demonstration that
    electricity and magnetism are two aspects of a single electro-
    magnetic force.
    Naturally this unification enterprise has met with various
    obstacles along the way. In this century quantum mechanics was
    combined with special relativity, resulting in quantum field theory.
    This theory successfully explained many phenomena, such as how
    particles could be created or annihilated or how unstable particles
    decay, but it also seemed to predict, nonsensically, that the
    likelihood for certain interactions could be infinitely large.
    Richard Feynman, along with Julian Schwinger and Sin-Itiro
    Tomonaga, tamed these infinities by redefining the mass and charge
    of the electron in a process called renormalization. Their theory,
    quantum electrodynamics (QED), is the most precise theory known,
    and it serves as a prototype for other gauge theories (theories which
    show how forces arise from underlying symmetries), such as the
    electroweak theory, which assimilates the electromagnetic and weak
    nuclear forces into a single model.
    But the electroweak model too was vulnerable to infinities and
    physicists were worried that the theory would be useless. Then 't
    Hooft and Veltman overcame the difficulty (and the anxiety)
    through a renormalization comparable to Feynman's. To draw out
    the distinctiveness of Veltman's and 't Hooft's work further, one
    can say that they succeeded in renormalizing a non-Abelian gauge
    theory, whereas Feynman had renormalized an Abelian gauge theory
    (quantum electrodynamics). What does this mean? A mathematical
    function (such as the quantum field representing a particle's
    whereabouts) is invariant under a transformation (such as a shift in
    the phase of the field) if it remains the same after the transformation.
    One can consider the effect of two such transformations, A and B.
    An Abelian theory is one in which the effect of applying A and then
    B is the same as applying B first and then A. A non-Abelian theory
    is one in which the order for applying A and B does make a
    difference. Getting the non-Abelian electroweak model to work was
    a formidable theoretical problem.
    An essential ingredient in this scheme was the existence of
    another particle, the Higgs boson (named for Peter Higgs), whose
    role (in a behind-the-scenes capacity) is to confer mass upon many
    of the known particles. For example, interactions between the Higgs
    boson and the various force-carrying particles result in the W and Z
    bosons (carriers of the weak force) being massive (with masses of
    80 and 91 GeV, respectively) but the photon (carrier of the
    electromagnetic force) remaining massless.
    With Veltman's and 't Hooft's theoretical machinery in hand,
    physicists could more reliably estimate the masses of the W and Z,
    as well as produce at least a crude guide as to the likely mass of the
    top quark. (Mass estimates for exotic particles are of billion-dollar
    importance if Congress, say, is trying to decide whether or not to
    build an accelerator designed to discover that particle.) Happily,
    the W, Z, and top quark were subsequently created and detected in
    high energy collision experiments, and the Higgs boson is now itself
    an important quarry at places like Fermilab's Tevatron and CERN's
    Large Hadron Collider, under construction in Geneva.
    (Recommended reading: 't Hooft, Scientific American, June
    1980, excellent article on gauge theories in general; Veltman,
    Scientific American, November 1986, Higgs bosons. More
    information is available at the Swedish Academy website:
    http://www.nobel.se/announcement-99/physics99.html)

    THE 1999 NOBEL PRIZE IN CHEMISTRY goes to Ahmed H.
    Zewail of Caltech, for developing a technique that enables scientists
    to watch the extremely rapid middle stages of a chemical reaction.
    Relying on ultra-fast laser pulses, "femtosecond spectroscopy" can
    provide snapshots far faster than any camera--it can capture the
    motions of atoms within molecules in the time scale of
    femtoseconds (10^-15 s).
    An atom in a molecule typically performs a single vibration in
    just 10-100 femtoseconds, so this technique is fast enough to discern
    each and every step of any known chemical reaction. Shining pairs
    of femtosecond laser pulses on molecules (the first to initiate a
    reaction and the second to probe it) and studying what type of light
    they absorb yields information on the atoms' positions within the
    molecules at every step of a chemical reaction. With this technique,
    Zewail and his colleagues first studied (in the late 1980s) a 200-
    femtosecond disintegration of iodocyanide (ICN-->I+CN),
    observing the precise moment at which a chemical bond between
    iodine and carbon was about to break.
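    To get a feel for the timescale, one can estimate a vibration period from a typical infrared stretch frequency; the ~2000 cm^-1 wavenumber below is a generic order-of-magnitude assumption, not a value from Zewail's experiments.

```python
# Rough vibration period from a typical IR stretch frequency.
# The ~2000 cm^-1 wavenumber is a generic assumed value, not Zewail's data.
c_cm_per_s = 2.998e10        # speed of light, cm/s
nu_bar = 2000.0              # assumed wavenumber, cm^-1

period_fs = 1e15 / (c_cm_per_s * nu_bar)  # T = 1 / (c * nu_bar), in fs
print(f"{period_fs:.1f} fs")  # about 17 fs -- inside the 10-100 fs range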
    Since then, femtochemistry has revealed a whole new class of
    intermediate chemical compounds that exist less than a trillionth of a
    second between the beginning and end of a reaction. It has also
    provided a way of controlling the course of chemical reactions and
    developing desirable new materials for electronics. It has provided
    insights on the dissolving of liquids, corrosion and catalysis on
    surfaces (see Physics Today, October 1999, p. 19); and the
    molecular-level details of how chlorophyll molecules can efficiently
    convert sunlight into useable energy for plants during the process of
    photosynthesis. (Official announcement and further info at
    http://www.nobel.se/announcement-99/chemistry99.html; see also
    Scientific American, December 1990.)
  • Wow, this is the first I've heard about this. Proteins have always struck me (who's never taken an organic chem class) as programs, hardware and software in one, possibly optical in nature--at least analog.
    Does anyone informed care to "illucidate"?
  • Don't improve the human condition? How does a better understanding of electroweak forces benefit humanity? Why can't something profitable benefit humanity?

    Are you seriously telling me that life would be the same, or better, without computing?

    I challenge you to describe the world without computing.

    Peter Pawlowski
  • Nobel was a practical man. He invented dynamite himself. He wanted people to be rewarded for things that had direct application to the real world. As much as you might like mathematics, very rarely does it have a direct application to the real world, unless that application comes through Physics, Chemistry, or Medicine (which get prizes). Computing is a very practical field. Its applications are usually direct and tangible. Therefore the field deserves a prize.

    Peter Pawlowski
  • Nobel wanted fields that were practical. As you said, Math is the root. Physics, Chemistry, and Biology spring from Math. However, in itself, Math doesn't have an application; it is pure abstraction. The whole point of the Nobel Prize is to reward the best 'applicators'.

    Peter Pawlowski
  • If you're going to give a Nobel to Linus, as some are suggesting, you might as well give a Nobel to Gates and Jobs. Those people didn't do Nobel level stuff...

    As for your statement about computing being evolutionary, all science is. You can't get quarks without atoms. You can't get mitochondria without cells. You can't multiply before adding. All science is evolutionary. However, I still believe that Computer Science is a field that deserves a stand-alone Nobel. Not much we can do about it, though.

    Peter Pawlowski
  • Around one hundred years ago, when Alfred Nobel gave all his money to the Nobel Foundation, the computer was not foreseen. I propose a new addition to the array of prizes: a prize for computing. Some of the most influential advances in technology happen in, or because of, the field of computing. Shouldn't the people who spend their lives bettering that field, which very directly affects science, get a Nobel Prize as well?

    I say yes.

    Guess who I'd nominate first?

    Peter Pawlowski
  • Do they even have a Nobel Prize for *anything* involving computers?

    Just the application of computers in other Nobel fields. If somebody uses Linux to bring about world peace, they might send Linus to Stockholm. :)
  • Wow... I sorta thought Norway played some role in it, but figured I was just confusing the two. Thanks for pointing that out!
    r
  • I think we should have a /. prize to reward the person with the highest karma rating. The prize could be a licensed copy of Linux (what?). We should also reward the person who gets the most first posts and says nothing with a good slap in the face.
  • What, I'm not first, well at least I didn't say anything meaningful and as a bonus had a mispelling. Always preview boys and girls.........but I can't read..
  • The femtosecond reaction work he has done is truly revolutionary. I'd like to see some more features with a Scientific American-ish view of the work of all the awardees. Does such a place exist?

    -- Moondog
  • Um, uh, last time I checked my microwave didn't have a computer... neither does my grandmother's 1950s Amana Radarange. Hmmm. Oh, by the way, any car with a computer in it sucks!!!! (unless it's a laptop in the back seat)... Radio? No need for a computer there; radios have only gotten smaller, not better, with transistor and circuit technology, and the Marconi wireless could cross the Atlantic, which is about as far as one needs. And I do also believe that while some TVs use computer chips, they are not necessary; besides, what's worth watching on TV anyway? PC and N64? Well, they are just computers anyway, and I would much rather have my privacy. Oh, and quality of life: what's the quality of life for those who have no jobs, or one which offers a paper hat instead of self-respect, because they lost their jobs to a computer? In short, if you're going to make an argument, know what you are talking about and consider all sides of the issue before spouting off. Thank you.
  • Before you mark me down as flamebait, think of the statement you just made.

    The vendor was doing his job; with this logic, you could also say Bill Gates should win the Nobel prize due to the fact "that average users can now install and run what was once a nightmare." Still is a nightmare, IMHO, but Linux isn't out of the woods yet.

    Anyway, neither Linus nor Mandrake warrants the Nobel Peace Prize; that's not what it is there for. If anything, Linus is the father and central developer/focal point of the Linux effort, obviously. Perhaps more fitting would be the founders of the open source movement. GNU fathers, anyone?

    In any case, the Nobel is more about scientific theory affecting our interactions with, and understanding of, the world, not about computers.

    Computers are but a small part of life, my friends.

    (So there!)

  • Re: your comments: Shouldn't the people who spend their lives bettering the field, which very directly affects science, get a Nobel prize as well? I say yes. Guess who I'd nominate first?

    a Nobel Prize for CS . . ummm lemme guess..

    could it be ummmm . .

    perhaps the legendary . .

    the open source revolutionary . .

    one and only . .

    the Jimmy Kimmel to my Ben Stein. .

    the master of disaster . . .

    king o' crackers . .

    that freaky phreaker . .

    a real geeks geek . .

    the churnin hunk o' burnin' funk . .

    better than Beavis. . .

    doper than Dilbert . .

    the king of all coders . .

    the big man! . .

    Malda!?

    ummm yea, ok...he doesn't suck ;)

  • Re: your comments: I would instead suggest that a computer language which is widely adopted and solves a significant class of problems would be a better choice

    Then Dennis Ritchie is the obvious choice. He invented C and used C to build UNIX.

    nuff said?

  • I slept through a lot of Quantum Field Theory last year, but what I understood of their renormalization theory is that they could ignore infinite components in the integrals, since those were more or less fixed numbers.

    not:
    A/0 = B/0

    but:
    A + infinity = B + another infinity

    Therefore A = B.

    Just as tricky btw, but certainly not applicable to every other problem.
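    (A toy numerical illustration of that "A + infinity = B + another infinity" bookkeeping -- nothing like the real field-theory calculation, just the arithmetic flavor: both cutoff-regularized sums below grow without bound as the cutoff N increases, yet their difference settles down to a finite value.)

```python
def partial_sums(N):
    """Two cutoff-'regularized' divergent sums and their finite difference."""
    s1 = sum(1.0 / n for n in range(1, N + 1))        # grows like ln(N)
    s2 = sum(1.0 / (n + 1) for n in range(1, N + 1))  # also grows like ln(N)
    return s1, s2, s1 - s2                            # difference -> 1 as N grows

for N in (10, 1000, 100000):
    s1, s2, diff = partial_sums(N)
    print(N, round(s1, 3), round(diff, 6))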

    Ivo
  • by Denor ( 89982 ) <denor@yahoo.com> on Tuesday October 12, 1999 @02:33PM (#1619483) Homepage
    Ahh, the Nobel Prizes. Now this is an event worthy of seeing. Of course, since I can't attend in person, I won't be able to. It'd be nice if they were televised:

    Announcer: "And the Nobel for Physics goes to...."
    (Cut to shot of nervous hopefuls)
    Announcer: "Gerardus 't Hooft and Martinus Veltman, for elucidating the quantum structure of electroweak interactions in physics!"
    (The duo look surprised and go up to the stage to claim their prize. The audience cheers happily, though those who were not nominees have no idea what the announcer said)
    Veltman: "What can I say? This is indeed a proud moment for us both. We'd like to thank everyone in our lab for helping us. And our families, for supporting us morally."
    Hooft: "And Elvis."
    (Veltman just gives Hooft an odd look)
    Announcer: "And there you have it. Coming up next, the nobel prize for Chemistry! Right after this word from our sponsor...."

  • Uh... you didn't get that quite right, because if he'd get the Nobel Prize for peace he should actually be sent to Oslo...

    (See http://www.nobel.se/awarding/ [nobel.se])

  • I thought I'd reply 'cause I like these silly metaphors.

    I guess you could look at it this way:

    DNA is like source code, RNA is like the object files, and proteins are the compiled binaries.

    You could think of these signals as--I guess--equivalent to magic numbers in UNIX?

    The hardware/software question is kind of hard to tackle with this analogy, though--it kind of breaks down. I guess in an everyday sense, they are hardware--they are physical entities that move around and sometimes have structural functions and they function by physically interacting with other substances. But I think it's more fruitful to make them analogous to objects, in an OOP sense.

    Apparently the idea of signal sequences has been around for about 20 years.

  • Yeah, it is pretty simplified, but the Central Dogma isn't all that complicated, really--it was figuring it out that was a real bastard. What more do you need to know about the Central Dogma than DNA-->RNA-->protein? (OK, there's DNA replication and reverse transcriptase, but that's not much more.)

    I don't know if it would really be that great if computers were just like life. When you consider that it took about a billion years to get a basic cell, and another three billion to get the first multicellular organism--that's one hell of a wait for the next upgrade. Though I admit that we haven't made anything near the complexity of a single bacterium (but then again, we haven't had a billion years!). Plus the cruft factor is incredible. The human genome is roughly 3 Gbp--but only about 120 Mbp actually codes for anything! The rest is evolutionary debris or just outright junk (except possibly for structural purposes, I guess). That would be kind of annoying: filling up a 3 GB hard drive when all you really need is that 120 MB. Guess it's kind of like running Windows.

  • Hehe, it is not that easy. The prize in physics, for example, was given for what you call "fundamental research", i.e. research which does not have any immediate practical application (one cannot make better refrigerators based on the quantum theory of electroweak interactions ;-) ) and is thus quite tricky to explain to a mere mortal. Even working in physics (in a different area, though), I have only a general idea of what it is about. I was at the press conference where the prize winners were announced, and you should have seen the poor journalists running around asking desperate questions about the "practical significance" of the theory, because "electroweak interaction", "quarks" and "leptons" are not words that appear in the evening news ;-).
  • You assume, of course, that

    (1) they remember, and
    (2) they pay attention to who gets the Nobel Prize.

    I think (1 AND 2) is pretty unlikely.
  • see how your two molecules would react.
  • All right... You all have spent so much time in front of an 18" screen that you can't see the big picture. 1) The term "Dark Ages" refers to the purported lack of administration, peace, law, and order that prevailed in the aftermath of the "fall" of the Roman Empire. Most modern historians now agree that a) the Germanic tribes were not really a destabilizing force; they just weren't as bureaucratically minded as the Romans; and b) the Roman Catholic Church was not the stultifying force it had been made out to be. Also, regarding life before and after computers: does anybody know (or care) what happened to the people whose livelihoods depended on whales when light came from coal, energy from steam, and warm-weather gear from out west? The answer is a resounding NO! However, computers are different in one important way from ALL other technologies (television being the transitional exception): computers provide us with an illusion of involvement and community, but only an illusion. For example, I don't know whom I have addressed, or how many people "heard" me; and you all don't know anything at all about me except the contents of this message. Thanks for listening. Frank D
  • Other good books on Feynman are "Surely You're Joking, Mr. Feynman" and "What Do You Care What Other People Think?" These are sort of autobiographical. Close friends compiled his anecdotes as accurately as possible. These books were also the basis of "Infinity", a surprisingly good movie starring and directed by Matthew Broderick.
  • 't Hooft is also known for the 'Holographic Hypothesis'. This is the bizarre sounding idea that in some way space is 2d rather than 3d and may in fact have a lot of resemblances to a 2d cellular automaton like Conway's Game of Life. Strangely this isn't idle speculation but seems to be built into General Relativity from the beginning. I have a few notes and references at my web site [demon.co.uk]. The papers are a bit technical but the results are cool.
  • The story I heard was that Nobel's wife was involved in an affair with a young mathematician who would have stood a good chance of winning the prize for mathematics, had there been one...

    Anyway, mathematics and computer science have their own well-established prizes, so there isn't really a need for Nobels in those areas. Although of course, no other prizes come close to the Nobels in terms of public prestige -- the Nobels are probably the only academic prizes that most of the public can even name.

  • by JPMH ( 100614 )
    Ooops... the original mathematical demonstration of supersymmetry was Dimopoulos, Fayet, Gol'fand and Lichtman, 1971; and the first mainstream supersymmetric particle model was Wess and Zumino, 1973.

    So, not 't Hooft... sorry.

  • Unfortunately, renormalization consists of a somewhat more sophisticated version of the following equations:

    A/0 = B/0 {yes, that's dividing both sides by zero}

    Therefore A = B.

    Or rather, a + A/0 = b + B/0, so a=b (Feynman, Schwinger, Tomonaga, 1947).

    But you could ask whether this is really so much worse than A.0 = k B.0 (Newton, Leibniz c.1680).

    In both cases, once you have found the right way to show that the cancellations work for all finite values, the limit starts to look plausible (and you can start isolating just what situations would break it). 't Hooft gave the fundamental proof that all gauge theories genuinely are renormalisable (including the electroweak theory and QCD).

    The important thing about renormalisation is that the problem isn't with the interactions, it's that the set of basis functions that you're using to expand space are getting more and more nearly orthogonal to reality. That means you end up with something rather like a very very ill-conditioned matrix to invert. (If you like, this is the O^(-1) (B-A)). You don't have to expand in such a bad basis, though. 't Hooft was able to show that if certain symmetry properties hold then all the nasties cancel, whatever the basis expansion.
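    The "ill-conditioned matrix" point is easy to see numerically. Here is a generic linear-algebra sketch (a Hilbert matrix, the standard textbook example of a nearly-singular basis -- nothing specific to field theory): as the size grows, the condition number explodes, so inverting in that basis magnifies any error.

```python
import numpy as np

def hilbert(n):
    # Hilbert matrix H[i, j] = 1 / (i + j - 1): the classic nearly-singular basis.
    i = np.arange(1, n + 1)
    return 1.0 / (i[:, None] + i[None, :] - 1.0)

for n in (4, 8, 12):
    print(n, f"{np.linalg.cond(hilbert(n)):.2e}")
# The condition number explodes with n: the basis vectors become nearly
# parallel, so inverting in this basis magnifies any numerical error.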

    So it probably isn't still true that all physicists are "exceedingly unhappy" about renormalisation; but a better sequence of basis functions for thinking about small-scale reality would certainly be nice.

    't Hooft also found a quite unexpected mathematical gauge symmetry between bosons and fermions, which probably deserves a Nobel prize by itself. We still don't know whether the laws of nature have this supersymmetry or not, but the idea has fascinated theoretical physicists for 25 years, and is built in to superstrings in their very foundations.

    A thoroughly worthy winner.

  • That's actually what intrigued me about biology (my field of interest switched from computing to biochemistry for just this reason). The intriguing thing about biomolecules is that they must encode their purpose based on the physical properties of the molecules.
    Biosignals, unlike their computing equivalents, are not arbitrary. Instead they tend to be particular sequences of water-attracting and water-repelling amino acids which make certain things easier. The interactions are not optical, or even analog. They are physical in nature, making them extremely versatile, but also very complicated.
  • The story about Nobel's dispute with a mathematician is exactly that-a story. Nobel didn't feel that math was practical enough.
  • by BobC ( 101861 ) on Tuesday October 12, 1999 @02:03PM (#1619498)
    To avoid Slashdotting Sweden, the primary US mirror is http://nobel.sdsc.edu/announcement-99, and the announcement is also mirrored at several SUNsites.
  • Actually, this is a good point. Far too many of us geek types (by which I naturally mean, without prejudice, scientists and engineers) try to either justify our existence or prove our worth by throwing around jargon even in a general setting. "Intelligence by vocabulary," you might say.

    It's certainly true that there are concepts and procedures which are too complex to explain in layman's terms, but it's inexcusable not to make an effort to allow people without a PhD to understand your work.

    Take the case of the Chemistry Nobel that Zewail won; the official press release describes his work as analogous to creating a slow-motion film of a chemical reaction using the world's fastest camera. This is a nice, down-to-earth explanation of what is actually a rather complicated thing, which strikes me as valuable.

    This might be why so many technical classes are taught so poorly in universities. The professor either doesn't care or doesn't take the time to relate the subject matter to actual experiences that the student may have had. I'm not talking about doing a lot of hand holding, but refer again to the subject of the Chem award. It's more useful to the uninitiated to say that you're taking a bunch of pictures of the reaction so that you can play it back in slow motion than to just barf "femtosecond chemistry" all over your frosh class at the first lecture. Starting out this way provides a natural progression into the actual mechanics of the laser imaging technique and the students will have a general idea of where they're going at the outset.

    IT folks are certainly no less guilty of this. I've gotten farther explaining network wiring saying "differential signal" than by simply spouting off with "UTP CAT5." Absolute vs. relative paths? Start with a zip code or area code analogy and your criminally ignorant users will be much happier.

  • In 1956, William Shockley, John Bardeen and Walter Houser Brattain won the Nobel Prize for Physics "for their researches on semiconductors and their discovery of the transistor effect". Are you all happy with that? After all, anyone can build a computer, but only a few could come up with its fundamental building blocks.

Keep up the good work! But please don't ask me to help.

Working...