1999 Nobel Science Prizes Announced
Andrew Childs writes "The 1999 Nobel Prizes in the sciences have been announced. The physics prize goes to Gerardus 't Hooft and Martinus Veltman "for elucidating the quantum structure of electroweak interactions in physics." The chemistry prize goes to Ahmed Zewail "for his studies of the transition states of chemical reactions using femtosecond spectroscopy." And the prize in physiology or medicine goes to Günter Blobel "for the discovery that proteins have intrinsic signals that govern their transport and localization in the cell." "
Re:Günter Grass wins literature prize (Score:1)
"Hope the legislators are embarassed..."
They would be, if they knew their shame. Not very
familiar with Oklahoma rednecks, are you? They
don't KNOW their level of ignorance, and have no
sense of irony. And I doubt you've written them
(as if they can read) to let them know there's something they should be ashamed of.
Awards... (Score:1)
Re:Where's computing? (Score:2)
On another note, one should keep in mind that when the Nobel prizes were first established, they applied to a much broader scope. For instance, prizes were given for the study of the atom. With a better understanding of the atom, the fields of chemistry and biology were advanced. With advancements in biology, medicine progressed. Zewail's research could potentially allow for much progress to be made in biology.
One should remember the following when dealing with the sciences: "biology becomes chemistry, chemistry becomes physics, and physics becomes mathematics." Any college student can tell you that...
Re:"I challenge you..." (Score:3)
book to a pre-1970's section and start reading.
It used to suck.
Did it? Sure, there were other things to worry about then, but I maintain that the human condition was in a much better state back then than it is now. Computers only seem to get in the way most of the time. Sure, you can make a small exception in the field of scientific research, but think about all the other things we've lost due to the advent of the personal computer.
People are less productive at work. (Studies have proven this again and again.)
The government finds it much easier to maintain huge databases of people and their habits. It's a simple matter nowadays to find out any information you want about anyone -- something that was out of reach for anyone in the past who couldn't afford a private investigator is now as simple as entering your credit card and waiting for a tidy report of your neighbors' dirty laundry to arrive in your electronic mailbox. We have less privacy than before.
Think of how much time you waste wrestling with your computer every day, every week, every month. Add it all up. Some day, you'll want that time back.
So science has grown us a healthier tomato plant and given us the ability to maintain erections well into our sixties, and computers are a part of that. Small contribution compared to the rather enormous chunks of life and liberty they've usurped from us.
Has your quality of life really improved?
- A.P.
--
"One World, one Web, one Program" - Microsoft promotional ad
Re:Where's computing? (Score:1)
Not only did you get your assumptions wrong (I can easily devise formal systems that are complete and yet quite useful), the complete assertion makes little sense in this context.
Don't forget the Igs ... (Score:1)
Have a closer look at the committee's site (Score:1)
Also interesting is that today we can click through to the home pages of the prize winners. Again, look at the announcement page. Gerardus 't Hooft has the most interesting home page of them.
Re:aww (Score:1)
Do they even have a Nobel Prize for *anything* involving computers?
The computing equivalent is the Turing award (Score:4)
The closest thing to a Nobel in computer science is the Turing Award [acm.org], given by the ACM.
For mathematics, the closest equivalent is the Fields medal [elib.zib.de].
Re:Günter Grass wins literature prize (Score:1)
Physics is *not* mathematics (Score:1)
... (Score:1)
Re:Spelling (Score:1)
/peter
Re:Where's computing? (Score:1)
Re:Classic case of RTFM (Score:1)
Re:Nobel for Math "Error" (Score:1)
The high-energy/particle physics community in Europe is currently focusing on using CERN to find the Higgs boson (a.k.a. the "God Particle"). But, to do that (or at least to beat the Americans to it), they need more funding. Two European high-energy/particle physicists winning the Nobel is just the thing to get CERN more money.
------
The sad thing is... (Score:1)
Computing isn't the only thing "missing"... (Score:3)
I'm sure biographers have had a wonderful time guessing what influences in his life led him to favor those particular five fields.
In any case, Nobel himself specified it that way; you can't just add another prize for your favorite field. At best, you could try to establish another "memorial" prize, like the one for economics. This is probably good: if you could, everyone would be agitating for their favorite hero to get the coveted Nobel prize. And if they succeeded, then the prize wouldn't exactly be coveted any more...
Re:Some elucidation, I hope. (Score:1)
As a theoretical chemist, I must object. The interaction of two molecules is a consequence of electrostatics and the quantum mechanical properties of the electron (e.g., exchange and correlation, etc.)
To call these complex interactions "tactile" (while infinitely better than calling them optical) is just plain wrong.
As far as the hardware vs. software issue, I don't think you can apply those paradigms here. The proteins a piece of DNA codes for are a function of structure, and the actual enzyme is far more complex than the protein encoded for. For example, once assembled, a protein may go through several chemical changes before it becomes "activated" (e.g., picking up metal ions or other chemical moieties). So one could say that the protein has a program too.
Since one can call anything in a cell hardware or software to one degree or another, I don't think these analogies help to elucidate anything biological or physical in this case.
Not! - Renormalization recognizes our ignorance. (Score:1)
Renormalization is a very powerful technique that applies across a broad swath of physics, not just with elementary particles. The supposed "math error" is a recognition that the mathematical theories are incomplete. For example, in QED a calculation of the energy of an electron as a point charged particle gives you an infinite number - that can't be right of course because the electron does not have infinite energy or mass. An examination of where the infinity comes from reveals that it involves stuff at very small distances (and very high particle energies) - the renormalization technique is then to remove the infinity by putting in a high energy or short-distance cutoff in some physically consistent manner. Renormalization is really an augmentation of a mathematically "pretty" theory with some rules for getting practical calculations out of it - and the results are indeed very practical - QED has been verified in some instances to 1 part in 10^12 or so, better than any other basic physical theory we have.
The totally correct theory of course would not treat the electron (or photon) as a point particle but as some extended object (e.g., a string). String theories in fact do resolve all these weird infinities, which is why they are so popular. Unfortunately, it's impossible to calculate just about anything practical with string theory (yet).
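For a flavor of what "putting in a cutoff" means, here is a schematic (my own illustration, not the actual QED self-energy calculation):

```latex
% Schematic of cutoff renormalization -- illustrative only.
% A logarithmically divergent integral is regulated by a cutoff \Lambda:
\[
  \delta m(\Lambda) \;\sim\; \int_{m}^{\Lambda} \frac{dk}{k}
  \;=\; \ln\frac{\Lambda}{m} \;\to\; \infty
  \quad\text{as } \Lambda \to \infty .
\]
% The divergence is absorbed into the unobservable bare mass m_0,
% leaving the measured (renormalized) mass finite:
\[
  m_{\mathrm{phys}} \;=\; m_0(\Lambda) + \delta m(\Lambda),
  \qquad \frac{d\,m_{\mathrm{phys}}}{d\Lambda} = 0 .
\]
```

Observables are then written in terms of m_phys alone, so the cutoff drops out of every prediction; that is the sense in which renormalization "recognizes our ignorance" of physics above the cutoff.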
Some elucidation, I hope. (Score:2)
I think they're more hardware than software. Some proteins are enzymes, functioning in the regulation of vital chemical processes, others are structural, and that, to me, makes them more analogous to hardware.
Hope this helps.
If they must be software, I think proteins would be object code, while the genetic material (DNA for most of us) would be analogous to source code. Maybe all those introns (DNA segments that are copied in replication but never expressed) are comments, formatting, etc!
Then again, maybe it's all object code, with introns as the logic of the program, while extrons are the data. By this analogy, the extrons are the DNA code that gets transcribed (expressed) into proteins, while the introns dictate under what conditions to transcribe them. Loops and comparisons and subroutines, oh my!
I'd guess tactile, rather than optical. For one, unless one is working at extremely high (and thus damaging/dangerous to fragile DNA) frequencies, electromagnetic radiation (light) is inadequate to resolve the minute details of proteins. They're just too small. That's why electron microscopes and x-ray crystallography and the like are required instead of just visible-light microscopes.
Proteins interact with each other based on their shapes, and the attracting or repelling forces of their constituent parts. When they bind to each other, it works much like a lock and key (when they briefly mesh to facilitate some reaction, then break apart) or interlocking puzzle pieces (when they bind more permanently). Much more of a tactile than an optical event.
"I challenge you..." (Score:1)
It's already been done... Just open any history
book to a pre-1970's section and start reading.
It used to suck.
-WW
Re:"I challenge you..." (Score:1)
For you maybe, not for me.
"think about all the other things we've lost due to the advent of the personal computer. "
Now compare that to all that we've gained, and
you'll see a lopsided victory in favor of
computers. It's not even close.
Perhaps you're one of those that just would like
to get back to the "good old days"? I've heard
this scenario before... the grass is always greener
and all that.
"People are less productive at work."
You left out how computers have created entirely
new fields of work, while utterly replacing the
need for humans in others. It's give and take.
Sure, right now computers are a distraction in
SOME fields where they are not being used
correctly, or are overkill for the job at hand,
or the person is not responsible enough to handle
having a computer, or....... you get the point.
It's not the computer's fault that some people
mis-use it. I bet the same could be said for many
new inventions at the workplace.
"something that was out of reach for anyone in the past who couldn't afford a private inspector is now as simple as entering your credit card and waiting for a tidy report of your neighbors' dirty laundry to arrive in your electronic mailbox."
So you feel that only allowing the rich access to
information is a good thing? Funny, I thought the
notion of computers leveling the playing field was
a good thing!
"We have less privacy than before."
So what? Privacy is traded all the time for other
things. It's not "Here, take our privacy, and give
us nothing in return." There's a REASON you're
losing privacy.
And if you really want to, you can take it back.
Trade in your PC's, your radios, your CD players,
your microwave, your gameboy and N64, your TV,
and yes, even your car. Trade them all in, for
they all use computers, and go move out into the
wilderness, or some remote, foreign location.
Now you have your privacy... are YOU happy with that?
I wouldn't be.
"Small contribution compared to the rather
enormous chunks of life and liberty they've
usurped from us."
You're living in another reality. You should
write a book about it and stick it in the sci-fi
section. You might even make enough money to
afford that privacy you're after.
"Has your quality of life really improved?"
Are you reading this response in the comfort of
your own home or back yard, while sipping a fine
wine, smoking a cigar, lying on your back with the
blue skies above, and your kids playing in the pool?
No, but you could be, because of a little device
known as a FUCKING COMPUTER.
So, I guess my answer would be, "Yes."
-WW
Re:"I challenge you..." (Score:1)
And you have spent too much time studying history
that you can't see my
off a bumper sticker. Yes, we know correlation
does not equal causation.
Here's another one for ya. Maybe you can tell me
the history behind it:
"Jesus loves you.
Everyone else thinks you're an asshole."
-WW
Re:"I challenge you..." (Score:1)
I didn't say it had a PC, I said it had a
computer. A programmable electronic device that
performs prescribed operations on data (and
input/feedback) at high speed.
"by the way any car with a computer in it sucks!!!!"
Well, that would be any car made in the last
10-15 years or so... They're all controlled by
an on-board computer.
"radio no need for a computer there radios have only gotten smaller not better with transistor and circut technology, the marconi wireless could cross the atlantic and thats about as far as one
needs."
I should have qualified: modern, electronic radios. Of course radio waves themselves only need
power and amplification to be sent...
"whats the quality of life for those who have no
jobs or one which offers a paper hat instead of
self respect because they lost their jobs to a
computer."
I'm not trying to answer if computers have been
good to EVERYONE ON EARTH. There are always people
who come out on the bad side of everything. That
does not mean computers haven't changed things for
the good.
Or do you miss the days before we had factories
that can create products faster than men by
themselves?
" in short if youre going to make an arguement
know what you are talking about and consider all
sides of the issue before spouting off thank you"
I have considered all sides of the argument. You,
however, are ignorant on many points here.
-WW
Re:Slashdot Prize (Score:1)
I think we should have a
Highest? Not a good idea, as it benefits the very early subscribers who've had a longer time to post. Instead I suggest something like a signal-to-noise ratio: the number of posts per week that are above the baseline. Then perhaps sort by volume. This way you would reward people who consistently get 4-5 using a few pithy and to-the-point comments, but without continuously responding just to get their posts noticed and their karma up. For your information, in studies of organisational efficiency, people have always found simple metrics to be rather self-defeating, as people orient to the artificial goal instead of the organisational objectives. Defining metrics for measuring people is inherently a politicised process. E.g. would you say that having a high IPO is a good measure of success in Silicon Valley? If so, does this encourage people to overhype technology, giving the entire industry a bad name in the long term?
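As a rough sketch of that metric (the names, data shapes, and baseline score here are all my own invention, not anything Slashdot implements):

```python
# Hypothetical sketch of the proposed "signal" metric: for each poster,
# count comments per week scoring above a baseline, then rank by that rate.
# All names and thresholds are invented for illustration.
from collections import defaultdict

BASELINE = 3  # a post scoring above this counts as "signal"

def signal_rate(posts, weeks):
    """posts: list of (author, score); weeks: weeks each author has been posting."""
    signal = defaultdict(int)
    for author, score in posts:
        if score > BASELINE:
            signal[author] += 1
    # qualifying posts per week, so early subscribers get no head start
    return {a: n / weeks[a] for a, n in signal.items()}

posts = [("alice", 5), ("alice", 4), ("alice", 1), ("bob", 5), ("bob", 2)]
weeks = {"alice": 2, "bob": 1}
rates = signal_rate(posts, weeks)
# alice: 2 qualifying posts over 2 weeks; bob: 1 over 1 week
```

Dividing by weeks-since-first-post is what removes the early-subscriber advantage mentioned above.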
Oh well, he (or she) who defines the rules, gets the reward.
LL
Re:Where's computing? (Score:2)
No mathematics prize either, and I believe that subject existed then...
The mathematical equivalent is the Fields Medal [cs.unb.ca]. The story is that Nobel had a bit of a
As for other posters wanting something similar for computing, I would instead suggest that a computer language which is widely adopted and solves a significant class of problems would be a better choice. After all, what is a language but a systematic way of enunciating the concepts for a general problem domain? In this way, the greatest mark of respect for Perl and (to some extent) Python has been their rapid adoption by peer programmers. To paraphrase ESR, show them the code and reap the kudos.
LL
Can Theoretical Physics explain paranormal phenome (Score:3)
had two classes with this guy, so i am excited to finally see him get what he deserved a long time ago.
patrick.
Spelling (Score:1)
Re:Günter Grass wins literature prize (Score:1)
Some overzealous law enforcement official nabbed a guy, claiming the film was child porn. The case was thrown out of court. Plus, the guy got damages for being run through the system. That's a long way from OK banning the film.
Granted, still pretty embarrassing...
Re:Nobel for Math "Error".... Naaaaaah (Score:1)
Therefore A = B.
Yeah, but what about the fact that "infinity" might be "bigger" or "smaller" than the "other infinity"?
Re:The computing equivalent is the Turing award (Score:1)
The legend says that Nobel's wife/SO betrayed him with a mathematician...
The reason why there's no Nobel award for CS is more obvious... when the Nobel fund opened, there was no CS.
Richard Feynman (Score:1)
Re:Bongo Artist Supreme (Score:1)
There's a new Feynman book just out, based on transcripts of his own words: The Pleasure of Finding Things Out [amazon.com].
As you say, one of the most fascinating, and also most brilliant, minds of this century. If I were selecting a "great figures of history" dinner party, Feynman would definitely be at the head of the table!
Bongo Artist Supreme (Score:1)
If you haven't read it, take a look at Genius by James Gleick. An excellent supplement to Feynman's own writings. He's truly one of the most fascinating folks of the 20th century.
Re:Slashdot Prize (Score:1)
Nobel for Math "Error" (Score:2)
A/0 = B/0 {yes, that's dividing both sides by zero}
Therefore A = B.
Needless to say, this technique can, properly applied, solve ALL problems. Physicists are exceedingly unhappy about having to renormalize QED/QCD (which generated its own set of Nobels for people like Feynman, Weinberg, Glashow, and Salam, among others), and it's widely felt this means the Standard Model is not the last word, even though it gives exquisitely accurate predictions that have subsequently been borne out in the real world. Unfortunately, trying to apply the same techniques to a quantum theory of gravitation leads to infinities that cannot be renormalized (so far, anyway), so hopes of a TOE (theory of everything) are still nebulous. Nevertheless, a whole bunch of theoretical physicists are devoting their lives to that holy grail and its Nobel.
Re:Where's MATH? : Why (Score:1)
-
Re:Computing isn't the only thing "missing"... (Score:2)
That's a myth. It's entirely untrue, but for some reason it comes up EVERY time this subject is discussed.
-
Should computing have a Nobel? (Score:1)
But I doubt it will happen. Some of those networking advances use some pretty hefty math.
Re:The computing equivalent is the Turing award (Score:1)
-r
Re:Where's MATH? : Why (Score:1)
Video from press conference (Score:2)
Of course, I'll have to insert a Go Caltech! here (though I hadn't even heard of the place when he did his research in the late 80's... I'm here now.)
Re:Where's computing? (Score:1)
Re:Where's computing? (Score:1)
Günter Grass wins literature prize (Score:2)
Hope the legislators are embarrassed...
Re:It means we zap your racist brain with a laser (Score:1)
Re:The computing equivalent is the Turing award (Score:1)
So what? Nobel didn't establish the Economics prize either; it was started in 1968 to celebrate the 300th anniversary of the Swedish central bank. If there were enough call for a CS prize, they'd find a way to create one.
Classic case of RTFM (Score:3)
The answer is in the intent and purpose of the prize. Also, the Nobel in Economics arrived recently, in the 1960s, and is titled
"The Sveriges Riksbank (Bank of Sweden) Prize in Economic Sciences in Memory of Alfred Nobel." It is in many ways different from the "original prizes".
Check out the history for the answers, and as to why there is no Nobel in mathematics.
http://www.lib.lsu.edu/sci/chem/guides/srs118_h
L.
Re:Where's computing? (Score:2)
Most that happens in this field isn't Nobel calibre stuff. We are in an evolutionary, not revolutionary field.
Anyway, certainly not Linus, as Linux, though a revolution organizationally, is not particularly revolutionary technically. It is a very good OS, but at its root, it is merely a clone of something else. You don't get Nobels for copying someone else's work.
I'm not trying to diss anyone here. Linux is a great thing, but what is great about it is the method with which it was created, not the OS itself.
Re:That's exactly the point... (Score:1)
so that Nobel could be sure that when he died
he would be known as more than the dude that invented dynamite.
Just providing your daily dose of cynicism.
Re:Spelling (Score:1)
Re:aww (Score:1)
Re:PRESS RELEASE FROM AIP ON NOBEL PRIZE (Score:1)
The American Institute of Physics Bulletin of Physics News
Number 452 October 12, 1999 by Phillip F. Schewe and Ben Stein
THE 1999 NOBEL PRIZE FOR PHYSICS goes to Gerardus 't
Hooft of the University of Utrecht and Martinus Veltman, formerly
of the University of Michigan and now retired, for their work
toward deriving a unified framework for all the physical forces.
Their efforts, part of a tradition going back to the nineteenth
century, center on the search for underlying similarities or
symmetries among disparate phenomena, and the formulation of
these relations in a complex but elegant mathematical language. A
past example would be James Clerk Maxwell's demonstration that
electricity and magnetism are two aspects of a single electro-
magnetic force.
Naturally this unification enterprise has met with various
obstacles along the way. In this century quantum mechanics was
combined with special relativity, resulting in quantum field theory.
This theory successfully explained many phenomena, such as how
particles could be created or annihilated or how unstable particles
decay, but it also seemed to predict, nonsensically, that the
likelihood for certain interactions could be infinitely large.
Richard Feynman, along with Julian Schwinger and Sin-Itiro
Tomonaga, tamed these infinities by redefining the mass and charge
of the electron in a process called renormalization. Their theory,
quantum electrodynamics (QED), is the most precise theory known,
and it serves as a prototype for other gauge theories (theories which
show how forces arise from underlying symmetries), such as the
electroweak theory, which assimilates the electromagnetic and weak
nuclear forces into a single model.
But the electroweak model too was vulnerable to infinities and
physicists were worried that the theory would be useless. Then 't
Hooft and Veltman overcame the difficulty (and the anxiety)
through a renormalization comparable to Feynman's. To draw out
the distinctiveness of Veltman's and 't Hooft's work further, one
can say that they succeeded in renormalizing a non-Abelian gauge
theory, whereas Feynman had renormalized an Abelian gauge theory
(quantum electrodynamics). What does this mean? A mathematical
function (such as the quantum field representing a particle's
whereabouts) is invariant under a transformation (such as a shift in
the phase of the field) if it remains the same after the transformation.
One can consider the effect of two such transformations, A and B.
An Abelian theory is one in which the effect of applying A and then
B is the same as applying B first and then A. A non-Abelian theory
is one in which the order for applying A and B does make a
difference. Getting the non-Abelian electroweak model to work was
a formidable theoretical problem.
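The Abelian/non-Abelian distinction above is just commutativity of transformations. A toy numerical sketch (my own illustration, not the bulletin's and not gauge-theory mathematics): 1-D phase shifts commute, but 3-D rotations, like non-Abelian gauge transformations, do not.

```python
# Toy illustration of (non-)commuting transformations -- phase shifts
# e^{ia} e^{ib} = e^{ib} e^{ia} are Abelian, but 3-D rotations are not.
import math

def rot_x(t):  # rotation by angle t about the x-axis
    c, s = math.cos(t), math.sin(t)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_z(t):  # rotation by angle t about the z-axis
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

A, B = rot_x(math.pi / 2), rot_z(math.pi / 2)
AB, BA = matmul(A, B), matmul(B, A)

# Applying A then B differs from B then A: the order matters,
# which is exactly what "non-Abelian" means.
non_abelian = any(abs(AB[i][j] - BA[i][j]) > 1e-9
                  for i in range(3) for j in range(3))
# non_abelian is True
```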
An essential ingredient in this scheme was the existence of
another particle, the Higgs boson (named for Peter Higgs), whose
role (in a behind-the-scenes capacity) is to confer mass upon many
of the known particles. For example, interactions between the Higgs
boson and the various force-carrying particles result in the W and Z
bosons (carriers of the weak force) being massive (with masses of
80 and 91 GeV, respectively) but the photon (carrier of the
electromagnetic force) remaining massless.
With Veltman's and 't Hooft's theoretical machinery in hand,
physicists could more reliably estimate the masses of the W and Z,
as well as produce at least a crude guide as to the likely mass of the
top quark. (Mass estimates for exotic particles are of billion-dollar
importance if Congress, say, is trying to decide whether or not to
build an accelerator designed to discover that particle.) Happily,
the W, Z, and top quark were subsequently created and detected in
high energy collision experiments, and the Higgs boson is now itself
an important quarry at places like Fermilab's Tevatron and CERN's
Large Hadron Collider, under construction in Geneva.
(Recommended reading: 't Hooft, Scientific American, June
1980, excellent article on gauge theories in general; Veltman,
Scientific American, November 1986, Higgs bosons. More
information is available at the Swedish Academy website:
http://www.nobel.se/announcement-99/physics99.h
THE 1999 NOBEL PRIZE IN CHEMISTRY goes to Ahmed H.
Zewail of Caltech, for developing a technique that enables scientists
to watch the extremely rapid middle stages of a chemical reaction.
Relying on ultra-fast laser pulses, "femtosecond spectroscopy" can
provide snapshots far faster than any camera--it can capture the
motions of atoms within molecules in the time scale of
femtoseconds (10^-15 s).
An atom in a molecule typically performs a single vibration in
just 10-100 femtoseconds, so this technique is fast enough to discern
each and every step of any known chemical reaction. Shining pairs
of femtosecond laser pulses on molecules (the first to initiate a
reaction and the second to probe it) and studying what type of light
they absorb yields information on the atoms' positions within the
molecules at every step of a chemical reaction. With this technique,
Zewail and his colleagues first studied (in the late 1980s) a 200-
femtosecond disintegration of iodocyanide (ICN-->I+CN),
observing the precise moment at which a chemical bond between
iodine and carbon was about to break.
Since then, femtochemistry has revealed a whole new class of
intermediate chemical compounds that exist less than a trillionth of a
second between the beginning and end of a reaction. It has also
provided a way for controlling the courses of chemical reactions and
developing desirable new materials for electronics. It has provided
insights on the dissolving of liquids, corrosion and catalysis on
surfaces (see Physics Today, October 1999, p. 19); and the
molecular-level details of how chlorophyll molecules can efficiently
convert sunlight into useable energy for plants during the process of
photosynthesis. (Official announcement and further info at
http://www.nobel.se/announcement-99/chemistry99
Scientific American, December 1990.)
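The pump-probe scheme described above amounts to stepping a probe pulse's delay across the reaction and recording a snapshot at each step. A hypothetical sketch (all numbers are illustrative, chosen to match the 200 fs ICN example; this is not real instrument code):

```python
# Hypothetical pump-probe delay scan: fire a pump pulse, then a probe
# pulse delayed by a few femtoseconds, and record absorption at each delay.
FS = 1e-15  # one femtosecond, in seconds

def delay_scan(start_fs=0, stop_fs=200, step_fs=10):
    """Yield probe delays (in seconds) spanning a ~200 fs reaction."""
    for d in range(start_fs, stop_fs + 1, step_fs):
        yield d * FS

delays = list(delay_scan())
# 21 snapshots across the 200 fs ICN dissociation quoted above;
# a 10 fs step resolves the 10-100 fs single-vibration timescale.
```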
protein signals? (Score:1)
Does anyone informed care to "illucidate"?
Re:Where's computing? (Score:1)
Are you seriously telling me that life would be the same, or better, without computing?
I challenge you to describe the world without computing.
Peter Pawlowski
Re:Where's computing? (Score:1)
Peter Pawlowski
That's exactly the point... (Score:1)
Peter Pawlowski
Linus? Why? (Score:1)
As for your statement about computing being evolutionary, all science is. You can't get quarks without atoms. You can't get mitochondria without cells. You can't multiply before adding. All science is evolutionary. However, I still believe that computer science is a field that deserves a stand-alone Nobel. Not much we can do about it, though.
Peter Pawlowski
Where's computing? (Score:2)
I say yes.
Guess who I'd nominate first?
Peter Pawlowski
Re:aww (Score:1)
Just the application of computers in other Nobel fields. If somebody uses Linux to bring about world peace, they might send Linus to Stockholm.
Re:aww (Score:1)
r
Slashdot Prize (Score:1)
Re:Slashdot Prize (Score:1)
Congrats to Zewail (Score:1)
-- Moondog
Re:"I challenge you..." (Score:1)
Re:"I challenge you..." (Score:1)
Re:aww (Score:1)
The vendor was doing his job; with this logic, you could also say Bill Gates should win the Nobel prize due to the fact "that average users can now install and run what was once a nightmare." Still is a nightmare, IMHO, but Linux isn't out of the woods yet.
Anyway, neither Linus nor Mandrake warrants the Nobel Peace Prize; that's not what it's there for. But if anything, Linus is the father and central developer/focal point of the Linux effort, obviously. Perhaps more fitting would be the founders of the open source movement. GNU fathers, anyone?
In any case, the Nobel is more about scientific theory affecting our interactions with and understanding of the world, not about computers.
Computers are but a small part of life, my friends.
(So there!)
Re:Where's computing? (Score:1)
a Nobel Prize for CS . . ummm lemme guess..
could it be ummmm . .
perhaps the legendary . .
the open source revolutionary . .
one and only . .
the Jimmy Kimmel to my Ben Stein. .
the master of disaster . . .
king o' crackers . .
that freaky phreaker . .
a real geeks geek . .
the churnin hunk o' burnin' funk . .
better than Beavis. . .
doper than dilbert . .
the king of all coders . .
the big man! . .
Malda!?
ummm yea, ok...he doesn't suck ;)
Re:Where's Richie? (Score:1)
Then Dennis Ritchie is the obvious choice. He invented C and used C to invent UNIX.
nuff said?
Re:Nobel for Math "Error".... Naaaaaah (Score:1)
not:
A/0 = B/0
but:
A + infinity = B + another infinity
Therefore A = B.
Just as tricky btw, but certainly not applicable to every other problem.
Ivo
Truly News for Nerds (Score:3)
Announcer: "And the Nobel for Physics goes to...."
(Cut to shot of nervous hopefuls)
Announcer: "Gerardus 't Hooft and Martinus Veltman, for elucidating the quantum structure of electroweak interactions in physics!"
(The duo look surprised and go up to the stage to claim their prize. The audience cheers happily, though those who were not nominees have no idea what the announcer said)
Veltman: "What can I say? This is indeed a proud moment for us both. We'd like to thank everyone in our lab for helping us. And our families, for supporting us morally."
Hooft: "And Elvis."
(Veltman just gives Hooft an odd look)
Announcer: "And there you have it. Coming up next, the nobel prize for Chemistry! Right after this word from our sponsor...."
Re:aww (Score:1)
(See http://www.nobel.se/awarding/ [nobel.se])
Computer programming-biology metaphor (Score:1)
I guess you could look at it this way:
DNA is like source code, RNA is like the object files, and proteins are the compiled binaries.
You could think of these signals as--I guess--equivalent to magic numbers in UNIX?
The hardware/software question is kind of hard to tackle with this analogy, though--it kind of breaks down. I guess in an everyday sense, they are hardware--they are physical entities that move around and sometimes have structural functions and they function by physically interacting with other substances. But I think it's more fruitful to make them analogous to objects, in an OOP sense.
Apparently the idea of signal sequences has been around for about 20 years.
Re:Computer programming-biology metaphor (Score:1)
I don't know if it would really be that great if computers were just like life. When you consider it took about a billion years to get a basic cell, and another three billion to get the first multicellular organism--that's one hell of a wait for the next upgrade. Though I admit that we haven't made anything near the complexity of a single bacterium (but then again, we haven't had a billion years!). Plus, the cruft factor is incredible. The human genome is roughly 3 Gbp--but only about 120 Mbp actually codes for anything! The rest is evolutionary debris or just outright junk (except possibly for structural purposes, I guess). That would be kind of annoying, filling up a 3 GB hard drive when all you really need is that 120 MB. Guess it's kind of like running Windows.
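The coding-fraction arithmetic quoted above works out as follows (the 3 Gbp / 120 Mbp figures are the comment's own, not authoritative):

```python
# Back-of-envelope check of the figures quoted above: ~3 Gbp genome,
# ~120 Mbp of it coding. Numbers are the comment's, not authoritative.
genome_bp = 3_000_000_000   # ~3 Gbp total
coding_bp = 120_000_000     # ~120 Mbp coding

coding_fraction = coding_bp / genome_bp
print(f"coding fraction: {coding_fraction:.1%}")      # 4.0%
print(f"non-coding fraction: {1 - coding_fraction:.0%}")  # 96%
```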
Re:Translator (Score:1)
Re:Günter Grass wins literature prize (Score:1)
(1) they remember, and
(2) they pay attention to who gets the Nobel Prize.
I think (1 AND 2) is pretty unlikely.
It means we zap your racist brain with a laser to (Score:1)
Re:"I challenge you..." (Score:1)
Re:Bongo Artist Supreme (Score:1)
The Holographic Hypothesis by 't Hooft (Score:1)
Re:Computing isn't the only thing "missing"... (Score:1)
Anyway, mathematics and computer science have their own well-established prizes, so there isn't really a need for Nobels in those areas. Although of course, no other prizes come close to the Nobels in terms of public prestige -- the Nobels are probably the only academic prizes that most of the public can even name.
Ooops (Score:1)
So, not 't Hooft... sorry.
Renormalisation (Score:2)
A/0 = B/0 {yes, that's dividing both sides by zero}
Therefore A = B.
Or rather, a + A/0 = b + B/0, so a=b (Feynman, Schwinger, Tomonaga, 1947).
But you could ask whether this is really so much worse than A.0 = k B.0 (Newton, Leibniz c.1680).
In both cases, once you have found the right way to show that the cancellations work for all finite values, the limit starts to look plausible (and you can start isolating just what situations would break it). 't Hooft gave the fundamental proof that all gauge theories genuinely are renormalisable (including the electroweak theory and QCD).
The important thing about renormalisation is that the problem isn't with the interactions, it's that the set of basis functions that you're using to expand space are getting more and more nearly orthogonal to reality. That means you end up with something rather like a very very ill-conditioned matrix to invert. (If you like, this is the O^(-1) (B-A)). You don't have to expand in such a bad basis, though. 't Hooft was able to show that if certain symmetry properties hold then all the nasties cancel, whatever the basis expansion.
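The ill-conditioned-matrix analogy can be made concrete with a toy numerical example -- this has nothing to do with the actual gauge-theory calculation, it just shows the phenomenon of a bad basis. The Hilbert matrix is the classic case: it comes from expanding functions on [0,1] in plain monomials, which are so nearly parallel to each other that inversion amplifies tiny errors enormously:

```python
# Condition number of the Hilbert matrix, the textbook example of a
# nearly-degenerate basis. The entries are H[i][j] = 1 / (i + j + 1).
import numpy as np

def hilbert(n):
    i, j = np.indices((n, n))
    return 1.0 / (i + j + 1)

# The condition number measures how much matrix inversion magnifies
# small errors; it explodes as the basis gets worse.
for n in (4, 8, 12):
    print(f"n={n:2d}  condition number ~ {np.linalg.cond(hilbert(n)):.1e}")
```

Even at n=12 the condition number is so large that double-precision inversion is essentially meaningless -- which is roughly the flavour of "nasties" the right symmetry makes cancel.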
So it probably isn't still true that all physicists are "exceedingly unhappy" about renormalisation; but a better sequence of basis functions for thinking about small-scale reality would certainly be nice.
't Hooft also found a quite unexpected mathematical gauge symmetry between bosons and fermions, which probably deserves a Nobel prize by itself. We still don't know whether the laws of nature have this supersymmetry or not, but the idea has fascinated theoretical physicists for 25 years, and is built in to superstrings in their very foundations.
A thoroughly worthy winner.
Re:protein signals? (Score:1)
Biosignals, unlike their computing equivalents, are not arbitrary. Instead they tend to be particular sequences of water-attracting and water-repelling amino acids which make certain things easier. The interactions are not optical, or even analog. They are physical in nature, making them extremely versatile, but also very complicated.
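Since signal sequences tend to contain a run of water-repelling (hydrophobic) residues, a crude detector is just a sliding-window average of hydropathy scores. Here's a toy sketch -- the scores are the standard Kyte-Doolittle values, but the window size, threshold, and example sequences are purely illustrative:

```python
# Toy hydropathy scan: flag a sequence if any window of residues is,
# on average, strongly hydrophobic. Real signal-sequence prediction
# is far more involved; this only illustrates the idea.
KD = {"I": 4.5, "V": 4.2, "L": 3.8, "F": 2.8, "C": 2.5, "M": 1.9,
      "A": 1.8, "G": -0.4, "T": -0.7, "S": -0.8, "W": -0.9,
      "Y": -1.3, "P": -1.6, "H": -3.2, "E": -3.5, "Q": -3.5,
      "D": -3.5, "N": -3.5, "K": -3.9, "R": -4.5}

def looks_like_signal(seq, window=7, threshold=2.0):
    scores = [KD[aa] for aa in seq]
    return any(sum(scores[i:i + window]) / window > threshold
               for i in range(len(scores) - window + 1))

print(looks_like_signal("MKKLLVVLLLAGSSD"))  # hydrophobic core -> True
print(looks_like_signal("MDDEEKKRRQQNNSS"))  # charged/polar -> False
```

The "not arbitrary" point is exactly this: the signal works because of what the residues physically do in water, not because some lookup table says so.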
Re:Where's computing? (Score:1)
Use the mirrors! (Score:3)
Re:Translator (Score:1)
Actually, this is a good point. Far too many of us geek-types (by which I naturally mean, without prejudice, scientists and engineers) try to either justify our existence or prove our worth by throwing around jargon, even in a general setting. "Intelligence by vocabulary," you might say.
It's certainly true that there are concepts and procedures which are too complex to explain in layman's terms, but it's inexcusable not to make an effort to allow people without a PhD to understand your work.
Take the case of the Chemistry Nobel that Zewail won; the official press release describes his work as analogous to creating a slow-motion film of a chemical reaction using the world's fastest camera. This is a nice, down-to-earth explanation of what is actually a rather complicated thing, which strikes me as valuable.
This might be why so many technical classes are taught so poorly in universities. The professor either doesn't care or doesn't take the time to relate the subject matter to actual experiences that the student may have had. I'm not talking about doing a lot of hand holding, but refer again to the subject of the Chem award. It's more useful to the uninitiated to say that you're taking a bunch of pictures of the reaction so that you can play it back in slow motion than to just barf "femtosecond chemistry" all over your frosh class at the first lecture. Starting out this way provides a natural progression into the actual mechanics of the laser imaging technique and the students will have a general idea of where they're going at the outset.
IT folks are certainly no less guilty of this. I've gotten farther explaining network wiring by saying "differential signal" than by simply spouting off with "UTP CAT5." Absolute vs. relative paths? Start with a zip code or area code analogy and your criminally ignorant users will be much happier.
Re:Where's computing? (Score:1)