
Optical Computer Made From Frozen Light 441

Posted by Zonk
from the now-they-need-to-work-on-sabers dept.
neutron_p writes "Scientists at Harvard University have shown how ultra-cold atoms can be used to freeze and control light to form the "core" - or central processing unit - of an optical computer. Optical computers would transport information ten times faster than traditional electronic devices, smashing the intrinsic speed limit of silicon technology. This new research could be a major breakthrough in the quest to create super-fast computers that use light instead of electrons to process information. Professor Lene Hau is one of the world's foremost authorities on "slow light". Her research group became famous for slowing down light, which normally travels at 186,000 miles per second, to less than the speed of a bicycle."
  • by SIGALRM (784769) * on Thursday April 14, 2005 @09:45AM (#12233662) Journal
    ultra-cold atoms can be used to freeze and control light
    Crap, and I just bought a new water-cooled chassis with 6 fans and a lot of cool neon light tubes...

    Where do I get one of these? No, I want it now :)
  • by Anonymous Coward on Thursday April 14, 2005 @09:46AM (#12233676)
    ...you will void your warranty and may suffer a severe sunburn.
  • I am a skeptic (Score:5, Insightful)

    by Flywheels of Fire (836557) on Thursday April 14, 2005 @09:46AM (#12233679) Homepage

    Most of the positive fanatics [mithuro.com] write lots of papers; those who think it's not going anywhere (like me) don't. There are sound physical reasons to be skeptical, in my mind:

    1) Wavelengths are too big: 1 micron is now a large number, and optics doesn't work much smaller than this.

    2) There are no good nonlinearities. Anyone can make a linear OR gate optically, but to function as an effective digital technology you need nonlinearity and level restoration. This is missing in pure optical systems, except at very high power levels. The high power levels imply low density. There are some optical gates which process data in "femtoseconds," but ask them how long it takes to get to the next gate. Maybe someday someone will invent a great, low power, fast, optically nonlinear material. Don't invest in it yet.

    3) The serious workers are now mostly working in combined electronic/optical modes. The speeds here are limited by the gate speeds of the electronics, just like normal computers. You have to then ask if optics is a good (cost effective, space efficient, low power...) replacement for wire. Ultimately, the answer is probably yes, but there's an awful lot of work to do before that's true (for the distances of a few centimeters in high density computers, that is).
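Point 2 above (no level restoration in linear optics) can be illustrated with a toy numerical sketch; the loss and noise figures below are made-up assumptions, not measurements of any real optical gate:

```python
# Toy cascade model: why digital logic needs nonlinearity/level restoration.
# A purely linear stage passes an attenuated, noise-polluted analog level on;
# a restoring (nonlinear) stage snaps its output back to a clean 0 or 1.
# Loss/noise values are illustrative assumptions only.

def linear_stage(level, loss=0.9, noise=0.03):
    """Linear gate: output is the attenuated input plus an additive offset."""
    return level * loss + noise

def restoring_stage(level, threshold=0.5):
    """Nonlinear gate: threshold the input and re-emit a full logic level."""
    return 1.0 if level >= threshold else 0.0

def cascade(stage, depth, start=1.0):
    level = start
    for _ in range(depth):
        level = stage(level)
    return level

# After 30 stages, the linear chain has decayed toward its fixed point
# noise/(1-loss) = 0.3, indistinguishable from a noisy '0'; the
# restoring chain still outputs a clean 1.0.
linear_out = cascade(linear_stage, 30)
restored_out = cascade(restoring_stage, 30)
```

Without the thresholding step, a '1' and a '0' converge on the same analog level after enough stages, which is exactly the level-restoration problem the comment describes.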

    • Personally, I'm very happy that there are people out there without your rigid definitions of what's impossible and what's not.

      While this may not work (and I emphasize may), isn't it just a wee bit early to pronounce it impossible, implausible, or impractical?

    • Re:I am a skeptic (Score:5, Insightful)

      by OneOver137 (674481) on Thursday April 14, 2005 @10:11AM (#12233956) Journal
      1) Wavelengths are too big: 1 micron is now a large number, and optics doesn't work much smaller than this.

      Please clarify what you mean here. 1 micron is in the IR, and optical laws work just fine down to fractions of an Angstrom, as in Bragg diffraction and scattering in solids.
    • Re:I am a skeptic (Score:5, Informative)

      by karvind (833059) <.moc.liamg. .ta. .dnivrak.> on Thursday April 14, 2005 @10:13AM (#12233974) Journal
      Wavelengths are too big: 1 micron is now a large number, and optics doesn't work much smaller than this

      I am not sure what you meant by this. Modern photolithography (used in production) has optics which works well at the 193nm wavelength. EUV, which is a lot more complicated, has optics that works all the way down to a 13nm wavelength.

      The speeds here are limited by the gate speeds of the electronics, just like normal computers.

      I think you meant interconnect delay and not the switching speed of a transistor. State-of-the-art and next-generation transistors can switch in a fraction of a picosecond. Interconnects, on the other hand, don't scale well and are the bottleneck.

      Optical interconnects can break even for clock distribution, where skew & crosstalk are important and the network has a lot of capacitive load. That, in my opinion, will be the first place where optics will enter microprocessors.
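The interconnect-scaling point can be made concrete with a back-of-the-envelope Elmore estimate for an unbuffered distributed RC wire; the per-micron resistance and capacitance below are rough assumed round numbers, not data from any real process:

```python
def wire_delay(length_um, r_per_um=1.0, c_per_um=0.2e-15):
    """Elmore delay of an unbuffered distributed RC line, ~0.5*R*C.
    Both R and C grow linearly with length, so delay grows as length^2.
    r_per_um (ohm/um) and c_per_um (F/um) are assumed round numbers."""
    r_total = r_per_um * length_um
    c_total = c_per_um * length_um
    return 0.5 * r_total * c_total

gate_delay_s = 1e-12  # ~1 ps transistor switching, per the comment above

d_100um = wire_delay(100)     # 1e-12 s: already comparable to a gate
d_1mm   = wire_delay(1000)    # 1e-10 s: 100x a gate delay
d_1cm   = wire_delay(10_000)  # 1e-8 s: hopeless without repeaters
```

Because the delay grows quadratically with length, a 10x longer wire is 100x slower, which is why interconnect rather than transistor speed becomes the bottleneck.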

      • Re:I am a skeptic (Score:5, Interesting)

        by Idarubicin (579475) <allsquiet@hotmai[ ]om ['l.c' in gap]> on Thursday April 14, 2005 @11:28AM (#12234826) Journal
        I am not sure what you meant by this. Modern photolithography (used in production) has optics which works well at the 193nm wavelength. EUV, which is a lot more complicated, has optics that works all the way down to a 13nm wavelength.

        While those statements are true, I'm not sure if it's really legitimate to say that those wavelengths will work well inside a computational device.

        Calling 13nm 'extreme ultraviolet' is marketing--those are really soft x-rays at that point. You're getting into photons that are inconveniently energetic. That's fine if you're doing lithographic etching of chips, but murderous on your hardware in daily operation.

        We also don't have light sources capable of anywhere near the appropriate level of miniaturization for those very short wavelengths. Constructing one large EUV source for a chip fab plant is a very different engineering problem from constructing hundreds, thousands, or millions of such sources on each chip. The optics also get much more complex, expensive, and exotic as you move to shorter wavelengths. Once again, things that can be done in a billion-dollar chip fab are quite different from things that can be done on a hundred-dollar microchip.

    • Re:I am a skeptic (Score:5, Insightful)

      by wwest4 (183559) on Thursday April 14, 2005 @10:18AM (#12234020)
      > those who think it's not going anywhere (like me) don't [write papers].
      > There are sound physical reasons to be skeptical, in my mind:

      No disrespect intended, but... having doubts is a lousy reason to be discouraged from research into this, or any, field. The reality is exactly the reverse: skepticism is a really good motivation to go and validate your assertions, instead of just keeping them unproven in your mind.

    • 2) There are no good nonlinearities... OR gate optically .... blahblah... Maybe someday someone will invent a great, low power, fast, optically nonlinear material. Don't invest in it yet.

      Well, the people working in this area are, IMO, shortsighted. Who says light has to travel through (optical) wires or artificial gates? Light can be transmitted in 2D, in parallel, with no interference (unless you're talking holography). We can use that to our favor.

      There have been experiments in image recognition using light. I
    • Re:I am a skeptic (Score:5, Informative)

      by Anonymous Coward on Thursday April 14, 2005 @10:48AM (#12234395)
      1) Wavelengths are too big: 1 micron is now a large number, and optics doesn't work much smaller than this.

      Not precisely correct. Most of the optical switches that Intel was developing back in 1999-2000 used evanescent modes to propagate along phosphorous-doped silicon waveguides with widths in the .3 um range. Result: you can move the light around in smaller pipes, but the evanescent modes decay quickly, on the order of centimeters.

      2) There are no good nonlinearities. Anyone can make a linear OR gate optically, but to function as an effective digital technology you need nonlinearity and level restoration. This is missing in pure optical systems, except at very high power levels. The high power levels imply low density. There are some optical gates which process data in "femtoseconds," but ask them how long it takes to get to the next gate. Maybe someday someone will invent a great, low power, fast, optically nonlinear material. Don't invest in it yet.

      Can you expand on this a bit? I'm confused as to how releveling implies high powers. Are you saying that the need for additional power input in order to improve the eye is prohibitive? Are you talking power input or optical power density?

      Secondly, the gate region of a MOSFET (if doped appropriately to make the energy levels right) is an optically nonlinear material that makes a great switch. By setting the appropriate bias levels statically, one can change an optical OR gate into an AND gate into a NAND gate on the fly. While the switching rate is in tens of gigahertz, the reconfiguration rate is much slower, in the megahertz range, because you have to bleed off the common-mode biasing caps with another circuit and this takes longer. You can even have a buffered feedback circuit that does dynamic pre-emphasis over a few bits at a time. So what you have is an electrically biased and reconfigurable switch where the data path is all optical.

      3) The serious workers are now mostly working in combined electronic/optical modes. The speeds here are limited by the gate speeds of the electronics, just like normal computers. You have to then ask if optics is a good (cost effective, space efficient, low power...) replacement for wire. Ultimately, the answer is probably yes, but there's an awful lot of work to do before that's true (for the distances of a few centimeters in high density computers, that is).

      I agree. I'll bet that Intel's trying to perfect on-chip semiconductor lasers fabricated in their existing process. I'm pretty sure that they've nailed optical receivers in their process already.
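The evanescent-mode decay mentioned under point 1 can be sketched as simple exponential attenuation; the ~1 cm 1/e decay length comes from the comment above, and the rest is an assumed toy model:

```python
import math

def remaining_fraction(distance_m, decay_length_m=0.01):
    """Fraction of signal left after propagating distance_m in a mode
    that decays as exp(-x/L), with 1/e length L ~ 1 cm (per the comment)."""
    return math.exp(-distance_m / decay_length_m)

f_1mm  = remaining_fraction(0.001)  # ~0.90: fine for short on-chip hops
f_10cm = remaining_fraction(0.10)   # ~4.5e-5: useless at board scale
```

This is why sub-wavelength evanescent waveguides work for moving light around a chip but not for longer links.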
      • Re:I am a skeptic (Score:3, Interesting)

        by drmerope (771119)
        It is a common misconception that transistors are like switches. That explanation misses the point entirely. In digital circuits transistors are used as amplifiers. Traditional computers work by charging and discharging capacitors.

        Parent's parent's point about high-energy is that if your signal is strong enough to begin with, you might be able to finish the computation without amplifying it. In practice, this does not happen. Google "pass-gate" logic to learn how to use transistors as switches and how
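The "charging and discharging capacitors" picture maps directly onto the usual CMOS dynamic-power estimate P = a*f*C*V^2; the clock, capacitance, supply, and activity numbers below are assumed round figures for illustration only:

```python
def dynamic_power(freq_hz, switched_cap_f, vdd_v, activity=0.1):
    """Average CMOS dynamic power: each switching node dissipates C*V^2
    per charge/discharge cycle; 'activity' is the fraction of capacitance
    switching each cycle. All inputs here are illustrative assumptions."""
    return activity * freq_hz * switched_cap_f * vdd_v ** 2

# Assumed: 3 GHz clock, 100 nF total switchable capacitance, 1.2 V supply
p_watts = dynamic_power(3e9, 100e-9, 1.2)  # ~43 W, a plausible CPU figure
```

Note the quadratic dependence on supply voltage: halving Vdd quarters the dynamic power, which is why voltage scaling mattered so much more than raw clock speed.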
    • Photon size problem (Score:5, Interesting)

      by Laaserboy (823319) on Thursday April 14, 2005 @11:03AM (#12234534)
      1) Wavelengths are too big: 1 micron is now a large number, and optics doesn't work much smaller than this.

      This poster is correct. Since I have a Ph.D. in the field and the parent obviously knows something about optics, I might as well respond to the parent's critics.

      IR photons are BIG. Forcing light to bend around corners is difficult. A waveguide must have a very high index of refraction if it is to bend light within a reasonable radius. That a Bose-Einstein Condensate helps with this problem is encouraging, if you don't mind cooling your computer to 2 millikelvin.

      The speed of these optical computers always seems to come down to limitations of the silicon processors that work in conjunction with the light.

      It's just a Bose-Einstein Condensate. These projects take time. While we are enamored with this BEC project, some poor grad student is working on carbon doping. Higher doping might improve the world of electronics far more than another optical computer claim.

      I visited Hau's website and did, though, enjoy her papers [harvard.edu]. I just don't think the press release accurately portrays the low engineering potential of this work.
  • by Hulkster (722642) on Thursday April 14, 2005 @09:47AM (#12233686) Homepage
    I guess all those guys using water cooling (and even the folks using liquid nitrogen) just got one-upped ... will we start seeing benchmarks using liquid helium cooling?

    BTW, for those interested, here's a direct link to the "Light at Bicycle Speed ... and Slower Yet!" presentation [harvard.edu] - I was travelling about that speed in my coldest car during a Colorado snowstorm. [komar.org]

  • by Leontes (653331) on Thursday April 14, 2005 @09:48AM (#12233688)
    e=mc^2 except where c is like slower and fuck, headache.
    • In other news, electricity is being generated from Albert Einstein's coffin as he spins in his grave...
    • by FhnuZoag (875558) on Thursday April 14, 2005 @10:19AM (#12234034)
      It's not the same.

      The c in E=mc^2 (or E^2 = M^2c^4 + p^2c^2) refers to an intrinsic property of spacetime. Bose Einstein Condensates and so on don't really alter that. One way to think about it is to stop with the 'slowing down light thing', and instead conceive it as the BEC swallowing up photons for a while, storing the information, and then reconstructing a new photon which is exactly identical at the end. This is pretty much the same, because in QM, you can't really track anything exactly, and you definitely can't distinguish between objects with the same properties.
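One way to put numbers on the "swallowing and re-emitting" picture: slow light is a group-velocity effect, v_g = c / (n + omega*dn/domega), so a medium with an extremely steep dispersion slope (as in EIT in a BEC) gives a tiny group velocity even while n itself stays near 1. The dispersion slopes below are assumed values chosen only to reproduce the right orders of magnitude:

```python
C = 299_792_458.0  # vacuum speed of light, m/s

def group_velocity(n, omega, dn_domega):
    """v_g = c / n_g, with group index n_g = n + omega * dn/domega."""
    return C / (n + omega * dn_domega)

# Ordinary glass near 800 nm: n ~ 1.5, mild dispersion -> v_g ~ 2e8 m/s
v_glass = group_velocity(1.5, 2.4e15, 1e-18)

# EIT-like medium: n ~ 1 but a huge assumed dispersion slope, picked so
# that n_g ~ 1.76e7 and v_g comes out near the famous 17 m/s
v_slow = group_velocity(1.0, 2.4e15, 7.35e-9)
```

The intrinsic constant c never changes; only the group velocity of the pulse inside the medium does.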
      • Yo man it be DJ Doomday, fresh from busting phat rhymes with my homie MC Hawking. I fin to give an explaination uh de momma pos fuh my homies Sheeit!

        Yo buss dis. It's not de same. De c in E=mc^2 (or E^2 = M^2c^4 + p^2c^2) refers to an intrinsic property uh spacetime. Bose Einstein Condensates an' so on ain't really altuh dat. One way to think 'boutit be to stop wit de 'slowin down light thin', an' instead conceive it as de BEC swallowin up photons fuh a while, storin de information, an' den reconskructin
  • by Doc Ruby (173196) on Thursday April 14, 2005 @09:48AM (#12233693) Homepage Journal
    Imagine trying to harness today's 3GHz CPUs with 1930s lab bench equipment. Digital electronics could have seemed another universe, out of reach in a universe of alternate physics "beyond radio". If photonic computation is within reach at artificially lowered speeds, we might be just about to cross the watershed, like going from ENIAC to the transistor.
  • by Anonymous Coward on Thursday April 14, 2005 @09:48AM (#12233696)
    Her research group became famous for slowing down light, which normally travels at 186,000 miles per second, to less than the speed of a bicycle."

    Ah, so she worked on IE.

  • by buddhahat (410161) on Thursday April 14, 2005 @09:48AM (#12233697) Homepage
    became famous for slowing down light, which normally travels at 186,000 miles per second, to less than the speed of a bicycle.

    ah yes, the Speed of a Bicycle (SoaB) metric for slow light.
  • by Anonymous Coward on Thursday April 14, 2005 @09:49AM (#12233700)
    The best thing about frozen light is that you can put it in your freezer, so that when there's a blackout, it will thaw and then you'll have light.
  • errrmmmm... (Score:3, Interesting)

    by shades66 (571498) on Thursday April 14, 2005 @09:49AM (#12233702)
    >to less than the speed of a bicycle.

    So is that
    1) A Bicycle with a jet engine strapped to it?
    2) A Bicycle going up a hill with an 80 year old man on it?
    3) A Bicycle being dropped off a building/cliff
    4) A Bicycle being raced?
    5) other?

    • by magarity (164372)
      We can safely rule out '1' since velocity can't be negative; any bicycle with just a bare jet engine strapped to it ain't goin' nowhere.
    • The speed of a bicycle going about 12 mph, IIRC. I remember reading about the experiment in high school, although SoaB's weren't used as a unit of velocity measurement back then.

      Also, the speed of light is 3E8 m/s in a vacuum. It travels slower through matter; the denser the matter, the slower the speed of light. In that experiment, light was shined through a supercooled gel and took so long to travel across it that it worked out to a velocity of ~12 mph.

    • by Anonymous Coward
      6) A bicycle hurled through space at nearly the speed of light?
  • ultra-cold atoms

    crap... what kind of a cooling system will this require?

    hm.. i wonder what frozen light looks like... well, i suppose you can't see it.

  • by brontus3927 (865730) <edwardra3@g m a il.com> on Thursday April 14, 2005 @09:51AM (#12233737) Homepage Journal
    And this means absolutely nothing to the non-supercomputer world. Light doesn't slow itself down for free. Freezing light for this process likely takes an expenditure equal to the GDP of a small country. At best, in the next 50 years there will be 2 frozen-light optical supercomputers.
  • I propose that "speed of a bicycle" be adopted as the standard measure of velocity in technical articles. Units already included in the standard are "Libraries of Congress" for data storage requirements and "Size of a Volkswagen" for physical size measurements.
  • ...the light freezes you!

    Will it at least make and keep my vodka cold, comrade?
  • Awesome (Score:4, Funny)

    by back@slash (176564) on Thursday April 14, 2005 @09:57AM (#12233807)
    Now all we need is Advanced Military Algorithms and Pre-Sentient Algorithms until we achieve Fusion Power and our units become twice as strong as our enemy's units.

    Intellectual Integrity and Cyberethics may pose a problem however.
  • Telecosmic (Score:2, Funny)

    by glenrm (640773)
    And we will all be overrun with Telecosmic cathedrals of light, blah, blah, blah...
  • Does this mean that I'll be able to go buy a lightsicle soon?
  • by Datamonstar (845886) on Thursday April 14, 2005 @10:02AM (#12233868)
    I'll finally get that lightsaber I've been wanting?
  • by stratjakt (596332) on Thursday April 14, 2005 @10:03AM (#12233873) Journal
    Obviously it's not simply a temperature thing, since most of space is absolute zero, and I can see stars and suns and stuff. So it's not freezing light as in freezing water.

    So how exactly do you stop photons from moving? How does this affect relativity (e=mc^2)? How does this affect our perception of the universe - ie; if the light from the star that we think is 10,000 light years away is only moving 20mph or so, it could really be millions of light years away?

    Does, like, time slow down? My head's spinning. Freeze sounds like the wrong word.
    • by aBrownCow (836216) on Thursday April 14, 2005 @10:23AM (#12234075)
      From Wikipedia: 'In a sense, any light travelling through a medium other than a vacuum travels below c as a result of refraction. However, certain materials have an exceptionally high refractive index: in particular, the optical density of a Bose-Einstein condensate can be very high. In 1999, a team of scientists led by Lene Hau were able to slow the speed of a light beam to about 17 metres per second, and, in 2001, they were able to momentarily stop a beam.' Slowing light down is nothing new, it happens every time light travels through a medium other than the vacuum of space. Atmosphere, glass window, diamond, etc. It just so happens that we can now create in a laboratory these BEC's, a so-called "superfluid" which is basically a substance cooled to the point where nearly every atom collapses to the lowest quantum state (like, close to absolute zero). This gives it some interesting properties, like zero viscosity and an extremely high optical density. Hope that helps.
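For scale, one can compare the effective (group) index implied by the quoted 17 m/s with everyday media; the glass and diamond indices here are standard textbook values:

```python
C = 299_792_458.0  # vacuum speed of light, m/s

def implied_index(v_mps):
    """Effective index n = c / v implied by an observed light speed."""
    return C / v_mps

n_glass   = implied_index(C / 1.5)    # ordinary glass: 1.5
n_diamond = implied_index(C / 2.42)   # diamond: 2.42
n_bec     = implied_index(17.0)       # the 17 m/s result: ~1.8e7
```

So the BEC behaves, optically, like a medium roughly ten million times "denser" than diamond.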
    • 1) Space isn't absolute zero. It hovers around 3 kelvin (three degrees above absolute zero).

      2) Really weird physics like this doesn't start happening until things get really cold. Think tenths or hundredths of a degree above absolute zero. Of course, since energy and temperature are related concepts, at absolute zero there is no energy and nothing moves.

      3) Relativity is still in effect. In fact it makes a lot of sense here. Less temperature = less energy (e), and the speed of light (c) decreases.

    • Real empty space (if such a thing exists) doesn't have a temperature. Temperature is about how much random kinetic energy something has, and nothing has no energy. (Actually wrong, because of virtual particles and the like, but let's just ignore this for now.)

      To freeze light, you reduce the temperature of the medium it travels in. When this gets really, really cold, because of quantum uncertainty, the whole lot stops acting like normal atoms at all, but as a single, big ball of stuff, following a set of ma
  • Speed of light (Score:3, Insightful)

    by dreadknought (324674) on Thursday April 14, 2005 @10:03AM (#12233879)
    The speed of light is _only_ 186,000 mi/sec when traveling through a vacuum. Light travels at slower speeds through all other media (e.g. Earth's atmosphere, glass, a super-cooled diamond, etc.)
  • At least we know we won't have to worry about cooling anything down in our computers. In fact, given the temps of those "frozen atoms," we may need heaters for the room that thing is sitting in... Not to mention that you can't have any more plexiglas on your tower, or you'll probably lose all your processing power to tanning!
    Damn, geeks, you're out of luck...
  • This is the first Physorg article that I've seen listed in /. that actually provides an offsite link for the story! Are they getting mellow, or did they just make a mistake and will go back to their usual "tarpit" methods?
  • Wasn't photon computing's purpose to use the speed of light to do computations? What use is it to have light for the processing if it's slower than the electrons we currently use?

    And with all this freezer stuff, I doubt it'll have any practical use except for one or two super-secret govt computers that need millions of dollars in budget to do some crypto-crunching stuff.
  • Is to shrink down those huge coolers to fit into this laptop, and now I will have to deal with a freezing lap instead of a burning one. Shit.
  • So these guys [litespeed.com] aren't exaggerating?
  • Ironically the problem with optical computers using this method is the same as that of silicon based systems - cooling!
  • by DumbSwede (521261) <slashdotbin@hotmail.com> on Thursday April 14, 2005 @10:24AM (#12234083) Homepage Journal
    OK, BECs are neat and all. Good science and good physics, but the fact that one can be used to trap the phase and amplitude of a wavefront of light for some time makes it a HUGE stretch to call it a computer.
    The title of this post clearly reads:
    Science: Optical Computer Made From Frozen Light

    We don't even have a diagram for a logic gate (or at least none are presented in the article), just some supposition in the article that such a thing could be used as a component. As for the 10x faster, where the hell did this number come from? Even if Moore's Law is slowing down (don't nitpick about it being about the number of components on a chip), it will make this "smashing" 10x advantage moot. Perhaps they refer to the speed of light in free space as opposed to signal speed in copper. But even this doesn't make sense, because signal speed in copper is about c/3.

    What really matters is how fast a gate can be made to switch, how easy it is to fabricate enough of them to do something useful, and how closely you can pack them together. Until someone can put down on paper a diagram of how this thing would work, it is pointless to posit that it would be 10x faster.

    Usually these pie-in-the-sky hype offerings claim 100x, 1000x, or 1,000,000x.

    That BECs might someday be used as parts in a quantum computer is a completely different thing, and those calculations that could be done quantumly would be trillions of times faster, but only for very specific algorithms. This article is not talking about that possibility, but about classical computing, and I think they have a lot of work to do just to demonstrate a single working component, let alone claim BEC computers are here or just around the corner.
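The parent's c/3 observation is easy to check numerically; the 2 cm span below is an assumed round figure for a large die/package:

```python
C = 3.0e8  # approximate vacuum speed of light, m/s

def flight_time_s(distance_m, velocity_mps):
    """Pure propagation (time-of-flight) delay, ignoring all switching."""
    return distance_m / velocity_mps

span_m = 0.02  # assumed 2 cm across a large die/package

t_copper = flight_time_s(span_m, C / 3.0)  # ~0.2 ns in copper
t_light  = flight_time_s(span_m, C)        # ~0.067 ns in free space
speedup  = t_copper / t_light              # 3x at best, not 10x
```

Even granting light its full vacuum speed, replacing copper with free-space propagation buys at most the 3x the parent mentions, not the article's 10x.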

  • Phasers?!?! (Score:3, Funny)

    by Kr3m3Puff (413047) * <(me) (at) (kitsonkelly.com)> on Thursday April 14, 2005 @10:27AM (#12234130) Homepage Journal
    Does this mean we can actually make phasers that produce slow photons, so we can have cool special effects in real life like Star Wars and Star Trek? Then our superheroes can dodge lasers.

    I am sure this will be the next product on Think Geek.

  • Fast microchips are all fine and dandy, but they're not going to satisfy my appetite for frozen light.
  • Defining light? (Score:2, Interesting)

    by ebvwfbw (864834)
    Light is known to behave as both a particle and an electromagnetic wave, like a radio wave. Maybe light isn't a radio wave at all; maybe it's a different critter.

    There again she could be showing us smoke and mirrors. This is light after all. I'm still on the skeptical side.

  • in the quest to create super-fast computers ... Her research group became famous for slowing down light

    It's things like this that enlighten me as to why there aren't more women in science.

    Female Genius: "I have this theory that we can create super-fast computers by slowing down light!"
    Old, Bald Male Faculty Head: "Stupid woman..."


    D.

  • Hopefully these ultra-cold atoms won't cost as much as those teeny-tiny atoms. Have you seen the price of those lately?
  • So if your computer has a "meltdown" when the power goes out and the fridge goes warm, do you have to buy a new CPU? Carry it home in a cooler full of dry ice from CompUSA?

    Hmmm new Outlook virus turns off ACPI, melts thousands of CPUs.

  • African or European?
  • Speed of a bicycle (Score:3, Interesting)

    by Criffer (842645) on Thursday April 14, 2005 @10:50AM (#12234405)
    If your measurement of the speed of light is comparing it to the speed of a bicycle, how do you know that the light has slowed, and it's not just that the bicycle has been superaccelerated (ridden really, really, really fast)?

    Einstein showed there is no objective measure of speed. Of course, if a bicycle were to travel at the speed of light, it would be very heavy and very short, but if you were the one riding it, you wouldn't notice...

  • by Dysson (457249) on Thursday April 14, 2005 @12:12PM (#12235468)
    Kudos to whoever is giving out mod points to people whose jokes completely blow. I have seen "Score:5, Funny" way too many times for observations that are just too painfully unfunny to read.

    >In Soviet Russia, light freezes you!!

    God, please stop.
