Optical Computer Made From Frozen Light 441
neutron_p writes "Scientists at Harvard University have shown how ultra-cold atoms can be used to freeze and control light to form the "core" - or central processing unit - of an optical computer. Optical computers would transport information ten times faster than traditional electronic devices, smashing the intrinsic speed limit of silicon technology. This new research could be a major breakthrough in the quest to create super-fast computers that use light instead of electrons to process information. Professor Lene Hau is one of the world's foremost authorities on "slow light". Her research group became famous for slowing down light, which normally travels at 186,000 miles per second, to less than the speed of a bicycle."
Cold Matters when it comes to Overclocking ... (Score:5, Informative)
BTW, for those interested, here's a direct link to the "Light at Bicycle Speed ... and Slower Yet!" presentation [harvard.edu] - I was travelling about that speed in
my coldest car during a Colorado snowstorm. [komar.org]
TFA - has popups (Score:1, Informative)
Scientists at Harvard University have shown how ultra-cold atoms can be used to freeze and control light to form the "core" - or central processing unit - of an optical computer. Optical computers would transport information ten times faster than traditional electronic devices, smashing the intrinsic speed limit of silicon technology.
This new research could be a major breakthrough in the quest to create super-fast computers that use light instead of electrons to process information. Professor Lene Hau is one of the world's foremost authorities on "slow light". Her research group became famous for slowing down light, which normally travels at 186,000 miles per second, to less than the speed of a bicycle.
Using the same apparatus, which contains a cloud of ultra-cold sodium atoms, they have even managed to freeze light altogether. Professor Hau says this could have applications in memory storage for a future generation of optical computers.
But Professor Hau's most recent research addresses the issue of optical computers head-on. She has calculated that ultra-cold atoms known as Bose-Einstein condensates (BECs) can be used to perform "controlled coherent processing" with light. In ordinary matter, the amplitude and phase of a light pulse would be smeared out, and any information content would be destroyed. Hau's work on slow light, however, has proved experimentally that these attributes can be preserved in a BEC. Such a device might one day become the CPU of an optical computer.
Traditional electronic computers are advancing ever closer to their theoretical limits for size and speed. Some scientists believe that optical computing will one day unleash a new revolution in smaller and faster computers.
Professor Lene Hau is Gordon McKay Professor of Applied Physics & Professor of Physics at Harvard University.
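A quick back-of-the-envelope check on the "speed of a bicycle" figure quoted above (a rough sketch; the 15 mph bicycle speed is an assumed illustrative value, not a number from the article):

```python
# Rough sanity check on the article's "speed of a bicycle" claim.
C_MILES_PER_SEC = 186_000            # speed of light quoted in the article
bicycle_mph = 15                     # assumed typical bicycle speed

c_mph = C_MILES_PER_SEC * 3600       # miles per second -> miles per hour
slowdown = c_mph / bicycle_mph
print(f"slowdown factor ~ {slowdown:.2e}")  # roughly 4.5e7
```

So "bicycle speed" amounts to slowing light by a factor of tens of millions.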
Re:I am a skeptic (Score:5, Informative)
I am not sure what you meant by this. Modern photolithography (used in production) has optics that work well at the 193 nm wavelength. EUV, which is a lot more complicated, has optics that work all the way down to the 13 nm wavelength.
The speeds here are limited by the gate speeds of the electronics, just like normal computers.
I think you meant interconnect delay and not switching speed of a transistor. State of the art and next generation transistors can switch in a fraction of a picosecond. On the other hand interconnects don't scale well and are the bottleneck.
Optical interconnects can break even for clock distribution, where skew & crosstalk are important and the network has a lot of capacitive load. That, in my opinion, will be the first place where optics will enter into microprocessors.
Re:nature abhors a vacuum unless it's a dirt devil (Score:5, Informative)
The c in E=mc^2 (or E^2 = M^2c^4 + p^2c^2) refers to an intrinsic property of spacetime. Bose Einstein Condensates and so on don't really alter that. One way to think about it is to stop with the 'slowing down light thing', and instead conceive it as the BEC swallowing up photons for a while, storing the information, and then reconstructing a new photon which is exactly identical at the end. This is pretty much the same, because in QM, you can't really track anything exactly, and you definitely can't distinguish between objects with the same properties.
Re:Quick Reflection on a Slow Mirror (Score:4, Informative)
ENIAC: 1946
Transistor: 1947
Hype in search of funding Dollars (Score:5, Informative)
The title of this post clearly reads:
Science: Optical Computer Made From Frozen Light
We don't even have a diagram for a logic gate (or at least none is presented in the article), just some supposition in the article that such a thing could be used as a component. As for the 10x faster, where the hell did this number come from? Even if Moore's Law is slowing down (don't nitpick about it being about the number of components on a chip), it will make this "smashing" 10x advantage moot. Perhaps they refer to the speed of light in free space as opposed to the signal speed in copper. But even this doesn't make sense, because the signal speed in copper is about c/3.
What really matters is how fast a gate can be made to switch, how easy it is to fabricate enough of them to do something useful, and how closely you can pack them together. Until someone can put down on paper a diagram of how this thing would work, it is pointless to posit that it would be 10x faster.
Usually these pie-in-the-sky hype offerings claim 100x or 1000x or 1,000,000x.
That BECs might someday be used as parts in a quantum computer would be a completely different thing, and the calculations that could be done quantum-mechanically would be trillions of times faster, but only for very specific algorithms. This article is not talking about that possibility, but about classical computing, and I think they have a lot of work to do just to demonstrate a single working component, let alone claim BEC computers are here or just around the corner.
Re:Can a physics geek explain how you "freeze" lig (Score:2, Informative)
2) Really weird physics like this doesn't start happening until things get really cold. Think tenths or hundredths of a degree above absolute zero. Of course, since energy and temperature are related concepts, at absolute zero there is no energy, and nothing moves.
3) Relativity is still in effect. In fact it makes a lot of sense here. Less temperature = less energy (e). The speed of light (c) decreases at the same rate as the square root of e. At absolute zero, e=0, c=0, m=infinity. Time has no meaning to light. Time only slows down/speeds up when your velocity changes with respect to the speed of light. If you were in the supercooled state, time would in fact slow down. The formula for time dilation is: t'=t(1-(v^2/c^2))^1/2
4) At 1 Kelvin (still colder than space) everything works normally.
5) At ultracold temperatures, Einstein predicted that really funky things would happen. Matter as we think of it tends to break down. It's called the Bose-Einstein condensate.
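The time dilation formula in point 3 is the standard special-relativity one, and can be checked numerically (a minimal sketch; the 0.6c example value is purely illustrative):

```python
import math

def dilated_time(t, v, c=1.0):
    """t' = t * (1 - v^2/c^2)^(1/2), the formula given in point 3."""
    return t * math.sqrt(1.0 - (v / c) ** 2)

# Example: a clock moving at 0.6c logs about 0.8 s for every 1 s
# measured in the rest frame.
print(dilated_time(1.0, 0.6))  # ~0.8, up to float rounding
```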
Re:Can a physics geek explain how you "freeze" lig (Score:2, Informative)
To freeze light, you reduce the temperature of the medium it travels in. When this gets really, really cold, because of quantum uncertainty, the whole lot stops acting like a collection of normal atoms and instead acts as a single big ball of stuff, following a set of mathematical laws known as Bose-Einstein statistics.
A quick digression. How does light travel through the air? Photons and electromagnetic waves are only part of the action. Almost inevitably, a photon hits an atom of air along the way. When this happens, it gets absorbed as energy, and this energy gets re-emitted as another photon. Due to the laws of physics, the probabilities are that the emitted photon is like the original photon. So, measured at the large scale, light seems to have been slowed down.
My understanding is that this is the same when you send light into the BEC, only that the entire BEC acts like an atom. Freezing light then, is to stop the BEC from re-emitting indefinitely, and just store the properties of the photon.
This has no effects on relativity. And it shouldn't affect our perception of the universe, because BECs are very fragile, and so probably rare.
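The absorb/re-emit picture above is what produces an effective macroscopic speed of c/n in a refractive medium. A minimal sketch of that bookkeeping (the n = 1.5 glass value is an illustrative assumption):

```python
# Minimal sketch: the net effect of all the absorb/re-emit delays is an
# effective propagation speed of c/n, where n is the refractive index.
C = 299_792_458.0                    # speed of light in vacuum, m/s

def transit_time(length_m, n):
    """Time for the light envelope to cross length_m of a medium of index n."""
    return length_m * n / C

# Illustrative: 1 m of glass (n ~ 1.5, assumed) vs 1 m of vacuum (n = 1).
ratio = transit_time(1.0, 1.5) / transit_time(1.0, 1.0)
print(f"glass is {ratio:.2f}x slower to cross")  # ~1.50x
```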
Re:I am a skeptic (Score:5, Informative)
Not precisely correct. Most of the optical switches that Intel was developing back in 1999-2000 used evanescent modes to propagate along phosphorous-doped silicon waveguides with widths in the
2) There are no good nonlinearities. Anyone can make a linear OR gate optically, but to function as an effective digital technology you need nonlinearity and level restoration. This is missing in pure optical systems, except at very high power levels. The high power levels imply low density. There are some optical gates which process data in "femtoseconds," but ask them how long it takes to get to the next gate. Maybe someday someone will invent a great, low power, fast, optically nonlinear material. Don't invest in it yet.
Can you expand on this a bit? I'm confused as to how releveling implies high powers. Are you saying that the need for additional power input in order to improve the eye is prohibitive? Are you talking power input or optical power density?
Secondly, the gate region of a MOSFET (if doped appropriately to make the energy levels right) is an optically nonlinear material that makes a great switch. By setting the appropriate bias levels statically, one can change an optical OR gate into an AND gate into a NAND gate on the fly. While the switching rate is in tens of gigahertz, the reconfiguration rate is much slower, in the megahertz range, because you have to bleed off the common-mode biasing caps with another circuit and this takes longer. You can even have a buffered feedback circuit that does dynamic pre-emphasis over a few bits at a time. So what you have is an electrically biased and reconfigurable switch where the data path is all optical.
3) The serious workers are now mostly working in combined electronic/optical modes. The speeds here are limited by the gate speeds of the electronics, just like normal computers. You have to then ask if optics is a good (cost effective, space efficient, low power...) replacement for wire. Ultimately, the answer is probably yes, but there's an awful lot of work to do before that's true (for the distances of a few centimeters in high density computers, that is).
I agree. I'll bet that Intel is trying to perfect on-chip semiconductor lasers fabricated in their existing process. I'm pretty sure that they've nailed optical receivers in their process already.
Re:May I be the first to say... (Score:1, Informative)
Metamoderate. [slashdot.org] That way you're more likely to get moderation points. Then you can counteract the moderations that you find incorrect.
As an added bonus, you might get to metamoderate the comments you disagree with. (Such as the comment that your comment references)
Re:Speed of Light? (Score:5, Informative)
If you want a picture of what's really going on, think of it this way: *photons* (the fundamental particles of light) always travel at the speed of light, c, as measured by any observer (like relativity says!). However, in optics, when we talk about "light" we don't usually mean individual photons, we mean a massive collection of them, and thus things change a bit. In vacuum, a light beam will travel at exactly c since all the photons travel at c. In a material, however, the photons are continually scattered by the atoms that make it up. These countless scattering events (which are essentially absorption and re-emission events) interfere and generate the final light-beam that we macroscopically observe. The interaction between the photons and the electron clouds in the material lead to time lags, if you will... so that the net macroscopic velocity appears reduced (even though, in principle, the photons travelling from one atom to the next were going at c).
There are experiments where light is "slowed" or "stopped" or even moved backward... and some where light even travels "faster than light." But what is travelling at these speeds is the emergent phenomenon (the envelope of the photon interference pattern), not the individual photons that make it up. Thus, even if the envelope of a photon wave pattern is travelling faster than c (i.e.: the calculated group velocity is >c), you still can't send a signal faster than c. The "no energy/signal can go faster than speed of light" rule is very much maintained. For more information on this, google the difference between "phase velocity" and "group velocity" of light, which will give you some insights.
The problem is that when introductory physics is taught, the difference between these different velocities is not mentioned (phase velocity != group velocity != photon velocity) And of course, news articles never mention it!!
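The phase-vs-group-velocity distinction can be made concrete with a toy dispersion relation (the linear form of n(w) below is an arbitrary assumption, chosen only so the index rises with frequency and the two velocities differ):

```python
# Toy dispersion relation illustrating phase vs. group velocity.
C = 1.0                              # work in units where c = 1

def n(w):
    return 1.0 + 0.1 * w             # assumed toy refractive index n(w)

def v_phase(w):
    return C / n(w)                  # phase velocity is c/n

def v_group(w, dw=1e-6):
    # wavenumber k(w) = w*n(w)/c; group velocity is dw/dk,
    # estimated here with a finite difference
    k = lambda x: x * n(x) / C
    return dw / (k(w + dw) - k(w))

w0 = 1.0
print(v_phase(w0), v_group(w0))      # group velocity < phase velocity < c
```

With normal dispersion (n rising with frequency) the group velocity comes out below the phase velocity, which is the envelope-vs-ripples picture the parent describes.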
Re:Can a physics geek explain how you "freeze" lig (Score:4, Informative)
Regarding point (3) -- "Less temperature = less energy (e). The speed of light (c) decreases at the same rate as the square root of e." I call shenanigans. c here is a constant relating the conversion of mass to energy (and vice versa). E does NOT refer to heat energy.
If it did, the speed of light would increase for hot objects (and on hot days). Time effects would be experienced by stars and nuclear reactors.
Re:Moore's law strikes again (Score:5, Informative)
Tell me about it. For a website that fashions itself as one for nerds, the speed-of-a-bicycle thing sounded as bad as Oprah talking physics.
Is it so hard to specify the value to which the beam of light was slowed down? At the very least, they could have linked to a slightly more detailed article on freezing light [physics.hku.hk].
Almost sounds like some arts major posted something in physics that went over their head.
Re:I am a skeptic (Score:2, Informative)
That idea floated around in the 60s-70s, and they know it doesn't work that way. Just because you can dope the gate doesn't make it optically non-linear. There may be some trap-based transitions, but the gain would be too low for any useful computation. The band gap changes by at most a few millivolts. Please do some literature homework before posting. The reconfiguration time is a separate issue.
Re:Moore's law strikes again (Score:2, Informative)
(1 attoParsec) / (0.000001 fortnight) = 0.0836939721 ft / s
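That conversion checks out from first principles (a quick sketch; the constants are rounded):

```python
# First-principles check of the attoparsec-per-microfortnight figure above.
PARSEC_M = 3.0857e16                 # metres per parsec (rounded)
FORTNIGHT_S = 14 * 24 * 3600         # seconds per fortnight
FT_PER_M = 3.28084                   # feet per metre

speed_ft_s = (1e-18 * PARSEC_M) / (1e-6 * FORTNIGHT_S) * FT_PER_M
print(speed_ft_s)                    # ~0.0837 ft/s, matching the parent
```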
Re:I am a skeptic also (Score:2, Informative)
As you say, there are no good low power nonlinearities. High power nonlinearities are easy to find -- the vacuum is nonlinear at high enough power levels. But I know of no optical nonlinearities which are functional at low -- or even modest -- power densities. This is important because it affects the packing density of the circuitry (see below).
The article uses a faulty metric -- the speed of propagation of signals is not the important criterion for designing a computer, but rather the delay in reaching the next gate. This depends as much or more on the density of the components (and the dimensionality of the construction technique) as it does on the speed. If components are spaced a foot apart, then it takes more than a nanosecond to reach them no matter what. While it is true that properly buffered CMOS on-chip wiring is only about 3% of the speed of light, the density (and required low power) of CMOS allows billions of gates to be reached in a nanosecond. Optical technology has a LONG WAY to go in reaching this point, let alone exceeding it. By then, 3-D silicon will make these numbers dramatically higher.
Also, superconducting on-chip interconnect will make on-chip silicon wiring dramatically faster (10x?) and is a much, much easier technology than BEC.
But the physics is sure cool.
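The parent's density argument can be put in numbers (a rough sketch; the 3% of c figure for buffered CMOS wiring is taken from the comment above):

```python
# Sketch of the parent's point: raw propagation speed matters less than how
# far (and therefore how many gates) a signal can reach per clock period.
C = 299_792_458.0                    # speed of light in vacuum, m/s

def reach_m(fraction_of_c, time_s):
    """Distance covered in time_s at the given fraction of c."""
    return fraction_of_c * C * time_s

ns = 1e-9
print(reach_m(1.0, ns))              # light in vacuum: ~0.3 m per nanosecond
print(reach_m(0.03, ns))             # CMOS wiring at ~3% of c: ~9 mm per ns
```

Even at 3% of c, a nanosecond reaches across an entire chip; it is the gate density and power budget, not the wire speed, that set the real limit.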