
Storing Light In Chips 164

Roland Piquepaille writes "Recently, researchers have 'stopped light' by storing light pulses in hot or extremely cold gases (see these earlier stories on Slashdot or at BBC News Online). Now, scientists from Stanford University have devised a method to store light pulses under ordinary conditions. In 'Light-storing chip charted,' Technology Research News says this opens the way for all-optical communications switches, quantum computers and quantum communications devices. The researchers plan to demonstrate the technique by trapping microwave signals within a year, and they think a prototype that works at optical frequencies could be built in two to five years. This overview contains more details and references."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Schrodinger (Score:5, Interesting)

    by andy666 ( 666062 ) on Saturday February 21, 2004 @09:44AM (#8348744)
    This was predicted by Schrodinger in the 30's - it really took them a long time to do it.
    • by Anonymous Coward on Saturday February 21, 2004 @10:10AM (#8348824)
      And to this date, nobody actually *tried* tying buttered toast to a cat's back, for the hovering-cat effect!
    • Re:Schrodinger (Score:5, Insightful)

      by Winkhorst ( 743546 ) on Saturday February 21, 2004 @10:12AM (#8348833)
      Hey, it took centuries to get around to using Copernicus's orbital equations with spacecraft. This is the beauty of basic research. It eventually has a practical use, but you can't base its validity on how long it takes to use it. And you have to distinguish between research and the ability to invent something. As John W. Campbell once pointed out, the Classical Greeks had everything necessarily to invent the phonograph, though it wasn't until Edison that somebody got around to doing it. In that particular case, it was the mental rut into which the Greeks had worn themselves that kept them from making much practical progress, thus leading to the return to power of irrational religion and the eventual rise of the Dark Ages.
      • Re:Schrodinger (Score:5, Informative)

        by andy666 ( 666062 ) on Saturday February 21, 2004 @10:36AM (#8348915)
        It isn't Copernicus' equations that are used for spacecraft, but Newton's F=ma, Newton's law of gravitation, and an occasional use of General Relativistic corrections.
      • Yes, a mental RUT!
        That is what happened to the Greeks.

        The fact that the Romans dominated the world for nearly another millennium is irrelevant.

        As is the fact that pagan culture was lost to Christianity centuries before the Dark Ages came.

        And the fact that on the whole eastern side of the empire things merely went on for hundreds more years...

        By the way, if you go read a history book, take your time to check IF technology stood still from 1200 BC to 200 BC in Greece.

        Get a clue!
    • "Storing light..!" What immediately comes to my mind is an eternal bulb! One that doesn't fuse or require power :-)
    • Dude, I am obtaining my PhD in optics and mirror design and have spent many years studying light at a certain giant engineering university. I am sorry to burst your bubble, but your statement is an absolute lie. This prediction was instead made by Planck in the early 1900s. Schrodinger, who hated Planck, attempted to steal all of his ideas and stifle his development... much like Newton stealing calculus from Leibniz.

      Please, from now on, do some research before you post.
  • Not hard (Score:5, Funny)

    by Squareball ( 523165 ) on Saturday February 21, 2004 @09:46AM (#8348752)
    Storing microwaves within a year isn't very hard. I mean a year is huge!
    • Re:Not hard (Score:1, Funny)

      by Anonymous Coward
      Yeah, but if you store them in a chip, it's much easier to find them -- provided that you don't lose the chip, of course.
    • Yeah, but if you store them in a chip, it's much easier to find them -- provided that you don't lose the chip, of course.

      Yeah, but who microwaves chips...I use salsa.
    • My Timex watch has been doing this for years.

      I just press the "indigo" button, and as if by magic, it releases the light it has stored. Amazing!

    • Re:Not hard (Score:3, Funny)

      by Squiffy ( 242681 )
      Last year I chipped a store with a microwave.
  • by Aurix ( 610383 ) on Saturday February 21, 2004 @09:46AM (#8348753)
    Anyone know if this would help out in display technologies?

    Ie, instead of refreshing a CRT, if the light was held until it was no longer needed?

    Might pave the way to some new display technologies =)
    • by cubic6 ( 650758 ) <tom@losth[ ].org ['alo' in gap]> on Saturday February 21, 2004 @09:48AM (#8348758) Homepage
      Well, if the light is held, it's not getting to your eyes, and thus not making a visible picture. So in that particular instance, I would think that this wouldn't help very much.
    • Well, no, because if light's held, you won't see it.

      The image you see on the CRT is from the phosphors emitting light. If the elements of the screen held light, you would see a black image until the light was released.

    • I think they call it an off switch; mine has one right on the monitor. When you don't need the photons, you push the little button and it stops producing them.
    • I would have thought it would help with things like buffers for light based network switches...
  • quantum? (Score:4, Insightful)

    by freerecords ( 750663 ) <slashdot@frBLUEeerecords.org minus berry> on Saturday February 21, 2004 @09:52AM (#8348771) Homepage Journal
    Quantum computers are still, and will be, a very, very long way off. It is not enough to say that one single development will speed their coming; rather, one obstacle will be replaced by another -- Sod's law.
    • There must be a certain number of obstacles that must be overcome, so each one that is overcome will move development on. It will only speed it up if it is overcome sooner than expected, of course.
  • So far I've seen.. (Score:5, Interesting)

    by p4ul13 ( 560810 ) on Saturday February 21, 2004 @09:54AM (#8348777) Homepage
    So far the comments have talked about using this for communication/processing devices. Some have mentioned using this tech for weapons and such.

    I'm wondering if light or other waves stored in such a fashion could be used as a battery of sorts.

    • I've always wondered if we could have laser powered jet engines. Instead of using gasoline to make the air expand, we'd use UV lasers (or whatever absorption band wavelengths oxygen/nitrogen have) to heat up the air.
      • As far as non-combustion jet engines go, they tried nuclear ramjets back in the '60s. They actually worked fairly well. Rather than lasers, suppose you had a few megawatts of microwaves being beamed up to a rectenna on an airplane, giving the airplane quite a bit of electrical power. The air in a jet engine could be heated by super-hot heating coils. Or else, you set alight a smallish pilot flame of acetylene or something, and pump the microwaves directly into the combustion chamber. This would create very
  • by enderanjin ( 753760 ) <enderanjin&gmail,com> on Saturday February 21, 2004 @09:54AM (#8348778)
    When can we step back into the past and correct someone else's mistakes?
    • Re:Quantum Leaping? (Score:2, Informative)

      by strike2867 ( 658030 )
      Moderators, quantum technology has been connected very closely to time travel. For more information look at any quantum book. I would suggest Stephen Hawking's old book, A Brief History of Time [amazon.com]. I don't have mod points anymore so I can't correct this.
    • A book by Michael Crichton sums up time in a unique way. Obviously he didn't come up with the idea, but he did write a fiction book around the principle.

      According to the book, you can never travel back or forward in time within your own universe. But you can teleport to another parallel universe (one of a nearly infinite number). And those universes may be either ahead of or behind your own universe in progress, relative to its general age.

      This idea really gets around paradoxes. For exampl
      • I read that book. It was fiction. Although he always claims to be using all these resources, he usually draws the wrong conclusions from them. He uses quantum mechanics incorrectly to prove his multiple-universe theory. I think that type of argument is called Chicken nest or something like that. IMHO the last good book he wrote was Jurassic Park. The three after that sucked.
  • by Mysteray ( 713473 ) on Saturday February 21, 2004 @09:55AM (#8348784) Homepage
    Keep in mind that this is only theoretical. The researchers plan to demonstrate this technique by trapping microwave signals within a year. They think that a prototype which works at optical frequencies could be made in two to five years.

    Does this sound like another one of those "breakthroughs" in optical/quantum computation where prototypes are "just around the corner" and commercialization is "just a few years away", yet it never happens?

    Tell me how this time it's different. Does it work on standard fab processes?

    I would really love a CPU with a terahertz clock. I guess it would still be I/O bound, though.

    • It may be slightly I/O bound, but we could always use a microwave bus, solid-state memory, and AGP 16X graphics cards. Plus, with a huge amount of cache (say, 16MB), the computer would be really fast without too many bottlenecks.
    • by robbot ( 606831 ) on Saturday February 21, 2004 @10:11AM (#8348830) Journal
      Yeah, I was excited reading the article until this quote:

      "The work would have been more impressive had the authors demonstrated the stopping of light experimentally," he added. -- Raymond Chiao, a professor of physics at the University of California at Berkeley.

      Yup, one of those 2-5-year things again, like so much else...
      • Yep... perhaps completely off-topic, but I invented a new technology which is "5 years off". However, I actually have code, I have a beta, I have simulators, and it's actually been shown.

        So what does it take to get something like this off the ground? Seems like the only way sometimes is lots of media/marketing hype to get a bunch of cash so you can actually do the work.

        I have all this stuff ready to show (have shown it several times), and I'm still broke and unemployed. Give me one good reason I shouldn't be
      • Can you say "Room temperature superconductors"? I have a feeling this is a lot further off than 2-5 years. Just vaporscience now.
    • Actually, I think the first such "breakthrough" was when they managed to stop light in a Bose-Einstein condensate, thereby proving that it was possible under extreme circumstances. This is a much more practical way of doing it. If they succeed, then we will move beyond the "breakthrough" into the "practice" part. It could be very good.

      Sorry for any misspellings or typos. I literally just crawled out of bed.

    • [sarcasm]
      Oh yeah, they really shouldn't have published this crap result, it simply doesn't live up to the hype. I mean, it's like the special effects were badly done.
      [/sarcasm]

      What's this, do you think that scientific progress should be kept in the shadows until it has reached a certain level of shock value?
      Do you *really* intend to sound as if you were disappointed, just because someone's kept busy and learned something that could be worthy of sharing? Because if you do, I believe there are issues with
      • No, you misunderstand me. I think it's great that we have pure researchers pushing out the limits of human knowledge, and am grateful for their work. I certainly am glad they have results to publish.

        I think the main problem is that we have a popular science press that, in talking down to its readers, always reports pure research as if it were applied research. While fun to read, the effect can be that technology becomes over-promised and over-hyped too early in its development. This can cause good tech to

        • I'm not sure which worries me more: the risk of inflating the public interest in the technologies you mention until it becomes disinterest, or the fact that these technologies will materialize eventually regardless of whether the public wants it...

          But I do see your point now.

    • Unless researchers announce preliminary findings 2-5 years in "advance" of a prototype how do you expect them to get funding? Investor mind-reading equipment to access scientist brains is still 10-15 years off.
      • Unless researchers announce preliminary findings 2-5 years in "advance" of a prototype how do you expect them to get funding?

        There's a big, wide gray area between methodical, conservative science and over-hype. Within this gray area is also a fine line that can only ever be seen in retrospect. There definitely needs to be a safe path for moving ideas from pure research through product development to commercial application, and capitalism is probably the worst way to do that, except for all the others.

        Loo

  • by G4from128k ( 686170 ) on Saturday February 21, 2004 @10:03AM (#8348803)
    I wonder if optical will simply be bypassed by other, already denser technologies. Semiconductor feature sizes are an order of magnitude smaller than a wavelength of light -- giving them at least a 100-fold advantage (assuming an optical computer could even have useful feature sizes at wavelength scales). Commercially available HD densities are over 100 bits per square micron. And this does not even count any new nanotechnologies in circuits or storage.

    I'm sure that optical will have a role in the future. The ability to send ultrahigh-bandwidth signals over long-distance fibers is extremely valuable. All-optical switching/routing would certainly improve latency. The ability of light beams to pass nondestructively through other light beams also makes light ideal for denser chip-to-chip and device-to-device interconnects. Finally, holographic memory storage might have a future (although it would not surprise me if current HD densities are on par with expected future holographic information densities).

    That's why I doubt that we will see an all-optical future. Other technologies already provide better densities in circuits and storage. Only in the realm of communications does optical really shine.
    • by AbbyNormal ( 216235 ) on Saturday February 21, 2004 @10:16AM (#8348849) Homepage
      Dude, check out my light harddrive.

      ..Opens case, goes blind and loses content of computer
    • by polv0 ( 596583 ) on Saturday February 21, 2004 @11:13AM (#8349038)
      I wonder if optical will simply be bypassed by other, already denser technologies.
      There are two primary restrictions on current microprocessors. One is our ability to manufacture large, deformity-free wafers of silicon. The other is the excessive heat generated by the electricity. Both are expected to slow our progress along Moore's Law using conventional microprocessor technology.

      What are the alternatives? It is possible to build deformity-free cubes of silicon. However, in a 3-dimensional chip the heat generated (which grows with the cube of the height of the chip) must be dissipated through surface area (which grows only with the square of the height), so this compounds the second problem.

      A probable alternative is the substitution of man-made diamond wafers for silicon. Diamond is far more heat-resistant than silicon, and can be created deformity-free by plasma layering processes. Unfortunately the technology is still nascent and wafer sizes are still minuscule.

      Optical computation would clearly provide a heat advantage. Imagine the newest supercomputer powered by a flashlight. But regardless, the greatest advantage of this technology, if realized and implemented for even a small set of basic algorithms, will be quantum computers.
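The cube-versus-square scaling argument above can be put in rough numbers. This is an editorial sketch with dimensionless toy figures (an idealized cube with six exposed faces), not real chip data:

```python
# Sketch of the 3-D chip heat argument: heat generated scales with volume
# (h**3) while dissipation is limited by surface area (h**2), so the flux
# each unit of surface must carry grows linearly with the chip height h.

def heat_flux_per_area(h):
    """Ratio of volume-generated heat to available surface area (~ h/6)."""
    generated = h ** 3        # total heat ~ number of active layers
    surface = 6 * h ** 2      # six faces of an idealized cube
    return generated / surface

# Doubling the chip height doubles the heat each unit of surface must shed.
for h in (1, 2, 4, 8):
    print(h, heat_flux_per_area(h))
```

This is why stacking layers compounds rather than sidesteps the heat problem: each doubling of height doubles the thermal load per unit of cooling surface.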
  • by polyp2000 ( 444682 ) on Saturday February 21, 2004 @10:05AM (#8348807) Homepage Journal
    The article gives the impression that these chips are storing or freezing light. I don't see how this is possible. If they were truly "storing" light, how would one know? The way I see it, if you can "see" or "observe" light then by definition the light must be escaping.
    • by Angstroman ( 747480 ) on Saturday February 21, 2004 @11:00AM (#8348996)
      Yes, the concept (it is only a theoretical concept, not a chip, in the paper) does store the light. When the optical pulse is completely within the postulated structure (meaning only a very short pulse can be stored), a modulation of the refractive index causes the fields associated with the pulse to be stored in the internal cavities of the crystal. Reversing the refractive index change causes the stored fields to reform a traveling wave, which exits the structure. The way that you know that the pulse has been stored in the computer simulations is that after the first refractive index change, nothing comes out of the structure. After the second change, a pulse emerges that has the same shape as the one that was sent in.
      • So if it has to be manually reversed, could you use it as a storage device? Eg, to store sunlight, or laser light for example?
        • Yes, I believe the intention is to allow such a structure to be a storage device. However, one should probably think of it more as information storage than energy storage. The entire light pulse must fit physically in the device in order to achieve the results when the refractive index is modulated. That implies that only very short pulses can be stored, since the pulse speed before modulation is a significant fraction of c.
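The "pulse must fit entirely inside the device" constraint above can be quantified. The device length and slowed group velocity below are assumed figures for illustration, not values from the paper:

```python
# A pulse of duration t moving at group velocity v_g occupies v_g * t of
# space, so it fits inside a device of length L only if t < L / v_g.
# All numbers below are assumptions chosen for illustration.

def max_pulse_duration(length_m, group_velocity_m_s):
    """Longest pulse (seconds) that fits entirely inside the device."""
    return length_m / group_velocity_m_s

# Hypothetical 1 mm device, pulse slowed to 1% of c before capture:
v_g = 0.01 * 299_792_458.0
print(max_pulse_duration(1e-3, v_g))   # sub-nanosecond
```

Even with substantial slowing before the capture step, a millimetre-scale device only accepts sub-nanosecond pulses, which is why the parent comment frames this as information storage rather than energy storage.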
    • by strike2867 ( 658030 ) on Saturday February 21, 2004 @12:26PM (#8349431)
      Light is just energy. Think about when light passes through glass. Do you think it just stops on one side and then suddenly appears on the other side out of nowhere? The molecules in the glass absorb the energy of the light, then pass it on to the next molecule. So for a very short amount of time each molecule stored the light. What seems to have been done here is that the scientists were able to keep the molecules in that excited state for a longer amount of time. BTW, I did not RTFA; I used to be a Phys Eng major.
    • The article gives the impression that these chips are storing or freezing light. I dont see how this is possible. If they were truly "storing" light how would one know? The way I see it, is that if you can "see" or "observe" light then by definition the light must be escaping.

      A better way of describing what this stuff does is that it records the state of the wave at every point in the medium. When they want to regenerate it, they recreate the pulse using that information. Effectively, all they're doing is
  • Marketing (Score:1, Funny)

    by zz99 ( 742545 )
    So soon the computer industry will see the same marketing as for soft drinks...

    I can picture the billboards: Buy a computer with a Pentium Light(tm) inside
  • Another Step (Score:5, Interesting)

    by Gyorg_Lavode ( 520114 ) on Saturday February 21, 2004 @10:08AM (#8348819)
    Another step in the right direction. It seems more and more like optical processing is the way that computers are going to go in the future. We all know that the current (no pun intended) electrical processors are not going to be sustainable, primarily because of heat, lithography limits, and quantum interactions on the traces.

    I wonder if this can be used for memory or just buffers of a sort. Don't get me wrong, I don't think anyone expects a transition away from electrical computers in the next decade, but the breakthroughs on the optical front seem to be accelerating.

  • by Anonymous Coward on Saturday February 21, 2004 @10:27AM (#8348884)
    What I think about is the future ability to create custom, finely tuned diamonds with different amounts of "impurities" grown into them with 0.30nm detail.

    What if you could not only use diamonds for electronic media, but also use the refractive nature of diamonds for storing and moving light?

    Couldn't the different light "switches" and other networking technology be added into diamonds as they are grown?

    Could you use something like that to grow 3-dimensional computer chips and storage media?

    Also, aren't diamonds pretty much destruction-proof... could you wear a future computer in a ring or a hard drive in an earring?
    • And there you have how easy it is to project what "could" happen. The harder part would be learning "how" to fine tune the structure of the impurities, etc.

      "Also aren't diamonds pretty much destruction proof"
      No. They chip.
    • The refractive properties of diamond, as far as I understood, are fixed. The crystals used in the paper need to be able to have their refractive index changed easily. Normally you can't grow crystals with non-uniform structures on a large industrial scale. Silicon chips, for example, are grown uniformly; then processes such as lithography are used to add features such as transistors. Diamonds aren't destruction-proof. Diamonds are hard, that is to say if you try to scratch them they don't. But they are a
  • ...The silmarils!
  • Speed of light? (Score:1, Insightful)

    by despik ( 691728 )
    IANA physicist, so I'm probably missing something here, but I thought that the speed of light [google.com] was actually a constant. Now, I did RTFA, and it states: The researchers' simulation shows that light pulses can be slowed to less than 10 centimeters per second. What's up?

    Also, as for storing light temporarily -- has anyone considered using a "mirror trap", in which the light would bounce around until the trap was opened?
    • Unless you make a mirror trap out of 100% reflective mirrors, then you're not going to trap light. :) ~X~
    • Re:Speed of light? (Score:5, Informative)

      by Weird O'Puns ( 749505 ) on Saturday February 21, 2004 @11:46AM (#8349252)

      If you had just looked at some links in your Google search you would have found this:

      To be precise, what we usually call the "speed of light" is really the speed of light in a vacuum (the absence of matter). In reality, the speed of light depends on the material that light moves through. Thus, for example, light moves slower in glass than in air, and in both cases the speed is less than in a vacuum. Link [utk.edu]

    • IAA(lso)NAP(hysicist), but no, the speed of light that is so oft-quoted is light's speed in a vacuum.
      When passing through various media, light can move at anything from 0 to c.

      To add to any potential confusion, there's some evidence that the fine structure constant, which determines EMF strength and thus 0-c, has changed a bit over the universe's history... but last I knew, these claims haven't been 100% proven.
    • The constant c, as in E=mc**2, represents the speed of light in a vacuum. It is, according to Einstein (and paraphrased by me), the speed limit of nature.

      It's long been known that light travels slower through a medium. It is this slowing that causes the bending of light rays called refraction. Refraction is the property of light which allows for such things as lenses and rainbows.
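The slowdown both replies describe is just v = c/n, where n is the refractive index of the medium. A quick sketch (the index values are common textbook figures, not numbers from the article):

```python
# Speed of light in a medium: v = c / n, where n is the refractive index.
# The indices below are typical textbook values, assumed for illustration.

C_VACUUM = 299_792_458.0  # m/s, exact by definition

def speed_in_medium(n):
    """Phase velocity of light in a medium with refractive index n."""
    return C_VACUUM / n

for name, n in [("vacuum", 1.0), ("water", 1.33), ("crown glass", 1.52)]:
    print(f"{name}: {speed_in_medium(n):,.0f} m/s")
```

The 10 cm/s figure quoted from the article corresponds to an enormous effective index, far beyond anything a passive material like glass provides.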
    • Mirror Trap? (Score:4, Informative)

      by Sunlighter ( 177996 ) on Saturday February 21, 2004 @03:03PM (#8350439)

      No, here's what you need. You take a microwave transmitter and blast a second or so of bits at the moon. Wait three or four seconds, it echoes back. Receive it. Correct the errors (you did use error-correcting code, didn't you?), then send it to the moon again. And when it echoes back transmit it again. And so forth. First trick: you can correct and retransmit simultaneously with the reception. So you can have more data in flight than you have memory for on Earth. Second trick: you'll note that the power you get back is far less than what you sent out. But you can still retain the data. You have to act as a repeater, but that's all.

      You could do this with mirrors, but the mirrors will probably be too close together to store very much. Still, a laser, and a nearly 90 degree angle, and the light will zig-zag a lot, and you might have a few hundred feet before you need a repeater. Damned dusty mirrors! Damned non-transparent air!

      Third trick: with the moon, you now have a sort of bubble memory, but it's over 100,000 miles long. You could do the same trick with 100,000 miles of fiber-optic cable. But if you could slow down the speed of light you could use shorter cable (or store more in the same cable without having to drive the frequency and the bit rate really high). Also, you could shorten the period, which means your data is available sooner.

      If you can really slow down light to a few cm per second, then you can store a lot of stuff. But you will need power for the repeating.

      (What would be better is to make windows out of this stuff. You could look out the window and see what was happening outside yesterday. But imagine the solar power applications if you made the glass twelve hours thick instead of twenty-four. Sunlight would shine in during the daytime, and come pouring out at night!)
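The moon-echo and slow-light schemes above come down to the same formula: capacity = delay x bit rate. A rough sketch, with the Earth-moon distance and the link speed as assumed round figures:

```python
# Delay-line storage capacity: bits_in_flight = round_trip_delay * bit_rate.
# The Earth-moon distance and 1 Gb/s link are assumed round figures.

C = 299_792_458.0          # m/s, speed of light in vacuum
EARTH_MOON_M = 384_400e3   # m, average Earth-moon distance
BIT_RATE = 1e9             # bits/s, hypothetical transmitter

delay = 2 * EARTH_MOON_M / C       # round-trip time, a bit over 2.5 s
print(delay, delay * BIT_RATE)     # seconds in flight, bits in flight

# Same formula for the slow-light case: 1 m of medium at 10 cm/s
slow_delay = 1.0 / 0.10            # 10 s transit through a single metre
print(slow_delay * BIT_RATE)       # bits stored per metre of medium
```

The point of slowing light, on this view, is that a metre of 10 cm/s medium buffers more bits than the entire Earth-moon round trip at the same bit rate.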

  • My flashlight has been able to store light in it for quite a while now. Just because they can do it on a chip now isn't a big deal.
  • I always envisioned some sort of window that passes light VERY slowly. Basically, you take this window and stick it somewhere like Florida for a while (years, whatever), then you put it in as a window. You see the sunshine and awesome views of Florida until it runs out, at which point you swap it out to get recharged. It would be expensive, but worth it for businesses or something in a rainy area like Oregon (where I live), for instance.
    • So what happens if you're on the other side of the window looking in? Everyone will think there's some window recharging lab geek inside your house....
  • Hooray, after all these years of being told that chips are fattening someone has finally managed to make chips with light in!

    Now if someone could just replace the sugar in Coke with light and I could eat my standard programmer's diet without getting fat enough to break my chair.
  • If only they could make it capture the light from the goatse.cx guy before it reaches my eyes...
  • by Anonymous Coward on Saturday February 21, 2004 @12:01PM (#8349317)
    I have been storing light in my fridge for years. Even when it's dark outside and I check, it is still there...
  • by Kegetys ( 659066 )
    Funny how they keep using that 'A' letter (the one with the ring on top, Slashdot seems to auto-convert it to a normal A?), which is spelled like 'O'... So when "Stargate" is spelled "Stargote", this new one, "Stargote Atlontis" sounds even more crazy. I hope the show itself will be good though, like the original Stargate... :)
  • Laser in a box? (Score:1, Interesting)

    by Anonymous Coward
    I wonder if this might eventually be a way to get around the size and power limitations of lasers... You could create a burst of laser light using a big clunky machine, freeze it, then take the light pulse with you. If you had a bunch of these pulses stored in, say, cartridges, you'd essentially have a light, ultra-portable laser with little need for a power supply, albeit one that will produce a limited number of pulses.
    • As far as I understood, there isn't a massive problem with the power of lasers. The one in the lab opposite my lecture theatre can produce powers in the terawatt range. The main problem with lasers, as far as I can see, is making ones that work at higher frequencies, hence higher bandwidth for optical fibres etc. Of course it's much more fun just to get lots of terawatt lasers and see if you can get stuff to take part in fusion :D
  • by JoeCommodore ( 567479 ) <larry@portcommodore.com> on Saturday February 21, 2004 @12:56PM (#8349563) Homepage
    Technology Research News says this opens the way...

    I know I've heard this spin several times before about optical processors, and just about every new advancement touts such claims. So I ask: when WILL we see 'the way' actually being opened?

    Of course this reply opens the way for people to flame me silly. And that IS a fact!

  • I'll believe it when I see it. I still have a cold-fusion reactor sitting on my desk; it was supposed to work in a beaker!
  • I've had my microwave in storage for over a year. What's the big deal?
  • Every time I open the freezer door, there is light.

    Other light bulbs around the house seem to burn out all the time and my wife is always turning up the furnace - coincidence? I think not.

    Do lights last longer in the north? What's the deal with those Northern Lights I hear about?

  • Photonic Battery? (Score:1, Insightful)

    by Anonymous Coward
    An all-optical computer requires storing light with its wave state intact for signalling: either its envelope, waveform, spin states, or some other modulated state decodable as information. How about a material or device that merely stores the photons, as power? As we look at more efficient transmission of power derived from light (solar), or delivered as light (lamps or displays), the photon/electron conversion becomes a liability. It eats power, and constrains possibilities for the workings of the machine
