Optical Computer Made From Frozen Light 441
neutron_p writes "Scientists at Harvard University have shown how ultra-cold atoms can be used to freeze and control light to form the "core" - or central processing unit - of an optical computer. Optical computers would transport information ten times faster than traditional electronic devices, smashing the intrinsic speed limit of silicon technology. This new research could be a major breakthrough in the quest to create super-fast computers that use light instead of electrons to process information. Professor Lene Hau is one of the world's foremost authorities on "slow light". Her research group became famous for slowing down light, which normally travels at 186,000 miles per second, to less than the speed of a bicycle."
Quick Reflection on a Slow Mirror (Score:5, Interesting)
errrmmmm... (Score:3, Interesting)
So is that
1) A Bicycle with a jet engine strapped to it?
2) A Bicycle going up a hill with an 80 year old man on it?
3) A Bicycle being dropped off a building/cliff?
4) A Bicycle being raced?
5) other?
Can a physics geek explain how you "freeze" light? (Score:5, Interesting)
So how exactly do you stop photons from moving? How does this affect relativity (E=mc^2)? How does this affect our perception of the universe? I.e., if the light from a star that we think is 10,000 light-years away is only moving at 20 mph or so, could it really be millions of light-years away?
Does, like, time slow down? My head's spinning. "Freeze" sounds like the wrong word.
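To put a rough number on the worry above, here's a back-of-the-envelope sketch. The 20 mph figure comes from the question itself; the distances are illustrative, and of course the slowing actually happens inside a tiny cold atom cloud, not across interstellar space:

```python
# Back-of-the-envelope: how long would starlight take to cross
# 10,000 light-years if it crawled at bicycle speed the whole way?
# (Illustrative only -- the slowing happens inside a small, ultra-cold
# cloud in the lab, not over interstellar distances.)

C_MPH = 186_000 * 3600          # speed of light in miles per hour
LY_MILES = C_MPH * 24 * 365     # miles in one light-year (approx.)

distance_miles = 10_000 * LY_MILES
slow_speed_mph = 20             # "speed of a bicycle"

years = distance_miles / slow_speed_mph / (24 * 365)
print(f"Travel time at 20 mph: {years:.2e} years")   # ~3.35e+11 years
```

So the light would take hundreds of billions of years to arrive, which is exactly why the effect is confined to the lab.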
Changing world of Physics (Score:2, Interesting)
The speed of light is now known to be controllable. One major university laboratory recently was able to actually stop light from moving. That kind of blows the constant out of the water. It also kind of makes the statement that I can't travel faster than the speed of light moot. Einstein had it right, though: it's all relative (in very simple terms). We also now know for a fact that instantaneous travel is physically possible via quantum entanglement, across any distance. Proven in a lab. Even harder-to-grasp concepts have been proven recently, such as a single object existing in two different places at the same time. Also proven in a lab. All of these have corresponding articles on Slashdot and are easily tracked down, so I won't waste my time providing the links. The next couple of decades ought to be pretty exciting for those who pursue new physics in these areas.
"The world is not what it seems, but is what it is. ~ Brian King"
Defining light? (Score:2, Interesting)
There again she could be showing us smoke and mirrors. This is light after all. I'm still on the skeptical side.
Speed of a bicycle (Score:3, Interesting)
Einstein showed there is no objective measure of speed. Of course, if a bicycle were to travel near the speed of light, it would appear very heavy and very short to a stationary observer, but, if you were the one riding it, you wouldn't notice...
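The "heavy and short" effect is just the Lorentz factor. A minimal sketch (the 0.999c speed is an arbitrary choice for illustration):

```python
import math

def lorentz_gamma(v_fraction_of_c: float) -> float:
    """Lorentz factor: gamma = 1 / sqrt(1 - (v/c)^2)."""
    return 1.0 / math.sqrt(1.0 - v_fraction_of_c ** 2)

# A bicycle at 99.9% of c: its energy/momentum scale up by gamma,
# and its length contracts by the same factor, as seen by a
# stationary observer. The rider measures nothing unusual.
gamma = lorentz_gamma(0.999)
print(f"gamma at 0.999c: {gamma:.1f}")   # ~22.4
```

At everyday bicycle speeds gamma is indistinguishable from 1, which is why nobody notices relativity on the way to work.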
Photon size problem (Score:5, Interesting)
This poster is correct. Since I have a Ph.D. in the field and the parent obviously knows something about optics, I might as well respond to the parent's critics.
IR photons are BIG. Forcing light to bend around corners is difficult: a waveguide must have a very high index of refraction if it is to bend light within a reasonable radius. That a Bose-Einstein Condensate helps with this problem is encouraging, if you don't mind cooling your computer to 2 millikelvin.
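To give a sense of scale for "IR photons are BIG": the wavelength and feature size below are typical textbook/industry values of the era, not figures from the article.

```python
# "IR photons are BIG": compare a telecom-band IR wavelength to a
# contemporary CMOS feature size. Values are typical, not from the
# article.
ir_wavelength_nm = 1550      # common telecom-band IR wavelength
feature_size_nm = 65         # a mid-2000s CMOS process node

ratio = ir_wavelength_nm / feature_size_nm
print(f"One IR wavelength spans ~{ratio:.0f} transistor feature widths")
```

A structure that guides such a wave can't easily be shrunk to transistor scale, which is the parent's point about bend radius and index of refraction.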
The speed of these optical computers always seems to come down to limitations of the silicon processors that work in conjunction with the light.
It's just a Bose-Einstein Condensate. These projects take time. While we are enamored with this BEC project, some poor grad student is working on carbon doping. Higher doping might improve the world of electronics far more than another optical computer claim.
I visited Hau's website and did, though, enjoy her papers [harvard.edu]. I just don't think the press release accurately conveys how limited the engineering potential of this work is.
Re:If you overclock it too much... (Score:2, Interesting)
Re:I am a skeptic (Score:5, Interesting)
While those statements are true, I'm not sure if it's really legitimate to say that those wavelengths will work well inside a computational device.
Calling 13nm 'extreme ultraviolet' is marketing--those are really soft x-rays at that point. You're getting into photons that are inconveniently energetic. That's fine if you're doing lithographic etching of chips, but murderous on your hardware in daily operation.
We also don't have light sources capable of anywhere near the appropriate level of miniaturization for those very short wavelengths. Constructing one large EUV source for a chip fab plant is a very different engineering problem from constructing hundreds, thousands, or millions of such sources on each chip. The optics also get much more complex, expensive, and exotic as you move to shorter wavelengths. Once again, things that can be done in a billion-dollar chip fab are quite different from things that can be done on a hundred-dollar microchip.
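Just how "inconveniently energetic" is a 13 nm photon compared to visible light? A quick sketch using E = hc/λ (constants are standard values; wavelengths are from the parent's own numbers plus an ordinary green-light reference):

```python
# Photon energy E = h * c / wavelength, expressed in electron-volts.
H = 6.626e-34        # Planck constant, J*s
C = 2.998e8          # speed of light in vacuum, m/s
EV = 1.602e-19       # joules per electron-volt

def photon_energy_ev(wavelength_nm: float) -> float:
    return H * C / (wavelength_nm * 1e-9) / EV

print(f"550 nm (green) photon: {photon_energy_ev(550):.2f} eV")  # ~2.25 eV
print(f" 13 nm (EUV)  photon: {photon_energy_ev(13):.1f} eV")    # ~95 eV
```

At roughly 95 eV per photon you're well into the soft-x-ray regime, enough to ionize atoms and damage materials, which is why "fine for lithography, murderous in daily operation" rings true.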
We all live in a BE Condensate (Score:2, Interesting)
General Relativity Called... (Score:2, Interesting)
You realize the light is basically the fabric of space vibrating. To slow down light requires either distorting space, or slowing down time. (Time slows down in the presence of mass because mass bends space, forcing it to travel faster.)
Re:Refraction = slowing? (Score:4, Interesting)
No, it is. Mentioning refraction is a little odd, though, since refraction is caused by the slowing of light, not the other way around.
Once you're out of free space, the speed that an electric field can move can be hugely affected by density, etc.
Think of it this way: in a high optical density material, light is so slow because it has to drag electrons around as it moves. Light's an electromagnetic field, after all, and electrons have an electric field.
Now, you could *also* consider on a very, very small scale (sub-sub-atomic) that the photons are in fact still traveling at the speed of light - it's just that they're interacting so often with the electrons present that their net speed is very, very, very low.
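The "net speed" picture above corresponds to the ordinary phase-velocity relation v = c/n for everyday materials (the indices below are typical textbook values); Hau's experiments exploit a vastly larger effective group index, which is how the speed gets down to bicycle range.

```python
# Phase velocity of light in a medium: v = c / n.
C = 299_792_458  # speed of light in vacuum, m/s

# Typical refractive indices (textbook values, not from the article).
materials = {"vacuum": 1.0, "water": 1.33, "glass": 1.5, "diamond": 2.42}

for name, n in materials.items():
    print(f"{name:>8}: v = {C / n / 1e6:,.0f} million m/s")
```

Even diamond only slows light by a factor of ~2.4; getting to ~10 m/s takes a different mechanism entirely (a steep dispersion in the ultra-cold atom cloud), not just a big n.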
Re:I am a skeptic (Score:3, Interesting)
Parent's parent's point about high energy is that if your signal is strong enough to begin with, you might be able to finish the computation without amplifying it. In practice, this does not happen. Google "pass-gate logic" to learn how to use transistors as switches, and how limited (and slow) a solution it is.
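A toy model of why unamplified pass-gate chains die: an NMOS pass transistor can only pull its output up to about (gate voltage - threshold voltage), so a logic "1" degrades with every stage that lacks restoring gain. The supply and threshold numbers below are illustrative, not from any specific process.

```python
# Toy model of an NMOS pass gate: when passing a logic high, the output
# rises only to (gate voltage - threshold voltage). In a chain where
# each degraded level drives the next gate, the "1" decays per stage.
# Numbers are illustrative.
VDD = 1.8        # supply / initial gate drive, volts
VTH = 0.4        # NMOS threshold voltage, volts

def nmos_pass(v_in: float, v_gate: float = VDD) -> float:
    """A strong '0' passes cleanly; a '1' is clipped to v_gate - VTH."""
    return min(v_in, v_gate - VTH)

v = VDD
for stage in range(1, 4):
    v = nmos_pass(VDD, v_gate=v)   # each gate driven by the previous,
                                   # already degraded, logic level
    print(f"after stage {stage}: {v:.1f} V")   # 1.4, 1.0, 0.6
```

Three stages in, the "1" is barely above mid-rail: without amplification somewhere, the signal simply cannot finish the computation, which is the objection being made.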
Second, the creators of this technology are scientists, not engineers. Scientists are notoriously good at making one of something. In the real world we have to deal with parameter variations: variability during manufacturing, variability in materials and contaminants, variability in operating conditions.
How much variability you support relates directly to the cost. When you talk about biasing a MOSFET to be an OR gate or an AND gate, you give the engineer in me a heart attack.
What you're proposing is to throw away the digital abstraction and introduce two-sided constraint assumptions. As a first guess, that seems reckless until you do a _very_ thorough analysis.
You've also not given a proposal for making an optical latch. No latch, no go--unless you're ready to dispose of the synchronous design abstraction as well.
If you're really serious about abandoning all of those assumptions, you should read "Asynchronous Pulse-Logic" (Kluwer Academic Publishers, 2002) to get a feel for the formalism you have to develop to have a notion of "engineering soundness" for what you propose.