Networking Communications Science Technology

"Time Telescope" Could Boost Fibre-Optic Communications 183

Posted by ScuttleMonkey
from the way-faster-than-88-mph dept.
An anonymous reader writes "A time lens can focus a chunk of time to a point, rather like a normal lens focuses light rays. Put two time lenses together and you can create what a Cornell University team calls a 'time domain telescope' which can magnify time. They sent a 2.5 nanosecond long light pulse, encoding 24 bits of information, into their time telescope. What came out on the other side was the same 24 bit pulse, but compressed into 92 picoseconds. Squashing more information into a light pulse could help to send more information via optical fibres."
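Back of the envelope, the summary's figures imply roughly a 27x compression and a corresponding jump in effective bit rate. A quick sketch (only the 2.5 ns, 92 ps, and 24-bit figures come from the summary; the rest is arithmetic):

```python
# Back-of-envelope numbers from the summary (illustrative only).
pulse_in = 2.5e-9     # input pulse duration: 2.5 ns
pulse_out = 92e-12    # output pulse duration: 92 ps
bits = 24             # bits encoded in the pulse

compression = pulse_in / pulse_out
rate_in = bits / pulse_in      # effective bit rate before compression
rate_out = bits / pulse_out    # effective bit rate after compression

print(f"compression factor: {compression:.1f}x")   # ~27x
print(f"rate in:  {rate_in / 1e9:.1f} Gb/s")       # 9.6 Gb/s
print(f"rate out: {rate_out / 1e9:.0f} Gb/s")      # ~261 Gb/s
```

Note the information content is unchanged; only the time (and hence bandwidth) occupied by the pulse shrinks.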
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • salesman speak (Score:4, Insightful)

    by circletimessquare (444983) <circletimessquar ... m ['l.c' in gap]> on Monday September 28, 2009 @03:54PM (#29572085) Homepage Journal

    "A time lens can focus a chunk of time to a point, rather like a normal lens focuses light rays."

    no, it's not LIKE a normal lens, it IS a normal lens. Kind of like how "cloud computing" is the same client/server model of decades past, a "time lens" is basically, uh, gee, a lens. But made sexy by introducing sci-fi fantasy terminology for the sake of grabbing attention

    • "cloud computing" is the "client/server" model like the iPhone is the "rotary phone" model. I certainly agree with your premise that people like to hype things, and this is just a lens with a fancy name. But "cloud computing" is a distant descendant of the "client/server" model. They aren't the same thing any more than a nuclear bomb is just "a really strong TNT bomb".
      • Re: (Score:3, Insightful)

        by maharb (1534501)

        Cloud computing is just client server model on a larger scale with new technologies to make it possible. They are exactly the same conceptually, the only difference is the specific technologies being used to complete the goal. Oh and one is a marketing buzzword used to generate interest while the other is a 'technical' description of a system.

        The only reason cloud computing is considered new is because of the scale it is being done on, the markets being targeted, and the technologies being used. So it ma

        • Re:salesman speak (Score:5, Insightful)

          by RightSaidFred99 (874576) on Monday September 28, 2009 @04:48PM (#29572791)

          Client/server is a communications model. Cloud computing is a business model, a management model, a deployment model, etc... You might as well say "networking" is the real concept, and that fancy "cloud computing" is just a PHB term for "networking". Let's just call cloud "computer networking!".

          Cloud computing isn't about a "client" and a "server". It's about moving more of your data and business processes off systems and software you support and letting someone else do it.

          Cloud computing will have client server components. So what? When I use my Xbox 360 to play games over the internet should I tell people I'm using a "client/server system" or that I'm playing my god damn Xbox 360?

          It's fun to mock the Latest Thing, and sometimes it deserves it, but cloud computing is not just a fancy name for Client/Server.

          • Re:salesman speak (Score:4, Insightful)

            by mevets (322601) on Monday September 28, 2009 @04:54PM (#29572861)

            Agreed, it is more like a fancy name for a mainframe with RJE.

          • by maharb (1534501)

            The client server model is not a communications model, it is all of the models you described above for cloud computing. You are looking at the cloud from the point of view that the 'marketers' want you to.

            The 'cloud' consists of many servers that store your information, process your information, and deliver it to you when you need it over a network. This supports lower powered client hardware, centrally stored data, changes to code can be deployed to the server, etc. I can go on all day with properties of

            • Client/server is a characteristic used in almost any networked system today. I'm using client/server if I use BitTorrent. Is it useful or descriptive to just call BitTorrent "Client/server"? Isn't "P2P" more descriptive?

              The point of the cloud isn't even technical - it's a business process. Instead of paying for your own "client" and "server" you let someone else pay for and host the "server", "network", "platforms", and/or "software".

              Client server is so implicit in pretty much everything computing and n

              • by maharb (1534501)

                You are assuming that because I am calling cloud computing the same thing as the client/server model, I think the term cloud computing is worthless. Despite my already saying the opposite, you continue to assume this. I agree 100% that cloud computing better describes a specific use of the client/server model.

                You also fail to realize that I am defending the OP's analogy because it is perfect. Just as we name different types of lenses different things like glasses, telescopes, etc. They are all fund

                • All you SlashDot posters run together to me, so my points may not have been 100% directed at you.

                  My point is to question the assumption that anything that sounds "markety" is necessarily just a meaningless, made up term for something that already exists.

                  We see the same thing with "Web 2.0". Sure, it gets used and abused but it means something. It's not just "the same as the Web/HTML". It's more interactive, doesn't rely on full page post/redraw cycles, etc...

                  Everything that's networked in modern computin

          • Re:salesman speak (Score:4, Insightful)

            by commodore64_love (1445365) on Monday September 28, 2009 @05:30PM (#29573229) Journal

            >>>Client/server is a communications model. Cloud computing is a business model

            Whatever. It still reminds me of the hellish 1970s/80s VAX machines where you could only access your programs/data from a central source, and if that source or connection went down, you were out of luck. I was much happier when I got rid of that and exchanged it for a computer that ran its own software any time and any place I felt like it.

            • by j35ter (895427)

              It still reminds me of the hellish 1970s/80s VAX machines where you could only access your programs/data from a central source, and if that source or connection went down, you were out of luck.

              Hey! You obviously never had a teletype.
              And who goes down??? If yer TTY wouldn't type, you probably forgot to pay the phone bill.
              And yeah, with some practicing, you could decode the 75bps stream with your ears, sonny!

      • by arotenbe (1203922)

        But "cloud computing" is a distant descendant of the "client/server" model. They aren't the same thing any more than a nuclear bomb is just "a really strong TNT bomb".

        If "cloud computing" is so different from the client-server model (with the server being provided by someone else), then surely you can name some differences between the two models.

        Well?

        • by PitaBred (632671)
          The cloud theoretically doesn't have fixed resources, unlike previous excursions into hosted serving. You either had enough capacity for everything, or you needed a faster server that ran idle most of the time. The cloud concept really is a complete rethinking of server balancing by distributing both the software and the data as needed.

          But that's just what I get from a bit of reading. I'm not a cloud user, though it'd be something I'd look at if I had a load I thought it could benefit.
        • First and foremost, interoperability and standards. Was there a hosting service "in the 1970's" (apparently when everyone likes to pretend something non-new was _really_ invented) where you could switch providers for storage, software as a service, manageability, etc...?

          Elasticity. Was there a notorious 1970's service where I could dynamically allocate network bandwidth, storage, virtual computers, etc... and choose from multiple vendors to do so? And old timers, don't bore me with claims about mainfram

          • by Nikker (749551)
            One way to see if the client/server model fits is to ask the question "If there is no interaction from outside the cloud, what does the cloud do?". Likely it will sort data, build indexes and gather more data based on what queries it anticipates next. If that works for you then Cloud Computing == Client/Server Model
    • by spun (1352) <`moc.oohay' `ta' `yranoituloverevol'> on Monday September 28, 2009 @04:12PM (#29572345) Journal

      So, a normal lens will compress a series of pulses into a shorter series? How, exactly? I didn't realize that normal lenses worked by exciting the atoms in a waveguide with an infrared laser.

    • by popo (107611)

      I think the word you're looking for is 'compression'

    • by blueg3 (192743)

      It's not a normal lens at all. An optical lens functions only in spatial domains, whereas this functions in the time domain. Granted, it does not "compress time", but that level of reporting is part for the course in science.

      If you know of a way of using optical lenses to turn a 1 GHz signal into a 2 GHz signal, do let us know.

      • by dwye (1127395)

        > but that level of reporting is part for the course in science.
        ^^^^^^^^^^^^^^^^^^^
        "par for the course" -- it is a golf analogy.

        Alternately, you could have used "part and parcel" of science reporting to imply that it is always like that except in very special cases like the No

        • Re: (Score:3, Funny)

          by blueg3 (192743)

          As it turns out, "part for the course" is "par for the course" when you hit one of the letters immediately adjacent to the "r".

    • A better analogue for cloud computing would be "outsourcing". It's not a client server model, it's a way to outsource the server part of the client server model. But if they called it outsourcing, people would think that all your data would be in an intelligible accent that kept telling you to check if the modem was plugged in.

    • by mea37 (1201159)

      The article and summary are horrible, but so is your understanding of what's going on. This is fundamentally different from a "normal" lens, but not in the way TFA suggests.

      This is not a "time lens" that focuses time in the way an optical lens focuses light. It is a time-domain lens that focuses light in the time domain, in the way that traditional optical lenses focus light in a spatial domain.

      It's worse than hyped-up marketing; it's a non-scientist who doesn't understand what he's talking about sprea

  • Deceptive Name (Score:5, Informative)

    by Anonymous Coward on Monday September 28, 2009 @03:55PM (#29572093)

    I'm used to these physics guys doing all kinds of crazy things with invisibility cloaks and such, so I took the title to be a literal time lens.

    After RTFA, the "time lens" is a frequency up-shifter. Still impressive, but not supernatural as I had hoped.

  • "A time lens can focus a chunk of time to a point,"

    Since Einstein we know that space and time are the same thing; we really should just call it "squishing space". Since time is really a measurement of a distribution of matter and energy, we've compressed the space (and hence the time).

    "Time and space and gravitation have no separate existence from matter. ... Physical objects are not in space, but these objects are spatially extended. In this way the concept 'empty space' loses its meaning. ... Since

  • by BlueBoxSW.com (745855) on Monday September 28, 2009 @04:00PM (#29572193) Homepage

    .. I should know since I read them 70 picoseconds ago using my time telescope.

  • WTH? (Score:4, Interesting)

    by Ancient_Hacker (751168) on Monday September 28, 2009 @04:02PM (#29572219)

    Moving pulses through time has been done with electronic delay lines for about 80 years now. The theory and technology are well worked out, both in the time and frequency/phase domain. A friend of mine worked out an alternate theory around 1961, which left the theorists scratching their heads--- how could there be TWO optimum but different ways of squishing pulses? But it was true.

    Anyway, you don't hear much about this technology as it's not a panacea of any sort. Any information you squeeze in time is going to undergo some unavoidable phase distortion-- not anything you want a lot of. And the inverse operation at the other end adds even more distortion. Yep, no free lunch, once again.

  • That means communications companies will soon be able to bring us 1000+ channels of infomercials and the same sports events for just $60 more per month, while at the same time capping our broadband usage at 2GB a month.

  • by Jason Pollock (45537) on Monday September 28, 2009 @04:14PM (#29572367) Homepage

    It's shifting the signal to a higher frequency (shorter wavelength), without going through a chip.

    From the article:

        The Cornell team made their time lenses using a silicon waveguide that can channel light. An information-carrying pulse made from a series of
        small laser bursts signalling digital 1s and 0s travels through an optical fibre and into the waveguide. As it enters, it is combined with another
        laser pulse from an infrared laser. The infrared pulse vibrates the atoms of the waveguide, which in turn shifts the frequencies of the
        data-carrying pulse before it exits the waveguide and passes into an optical fibre beyond.

  • Ironic (Score:5, Funny)

    by kitezh (1442937) on Monday September 28, 2009 @04:15PM (#29572373)
    When I logged in, I was greeted with "Did you know subscribers can see articles in the future?"
    • by smoker2 (750216)
      That's ironic because non-subscribers can see the articles in the present, usually sometime before the /. future takes place.
  • MUX? (Score:4, Informative)

    by HaeMaker (221642) on Monday September 28, 2009 @04:17PM (#29572395) Homepage
    The abstract of the actual article is a little more informative [nature.com], but still makes strange claims. I think they can compress a 10 GHz electrical signal into a 270 GHz optical signal, with obvious ramifications in multiplexing, as you can then take 27 such signals at a time (theoretically).
    • Re: (Score:3, Interesting)

      by 32771 (906153)

      The following seems a little better:
      http://nanophotonics.ece.cornell.edu/Publications/High-resolution%20spectroscopy%20using%20a%20frequency%20magnifier.pdf [cornell.edu]

      Don't ask me to explain it, I'm still searching for an easier explanation. If you have any contemporary optics knowledge you should be able to figure it out.

      The abstract of the actual article is a little more informative, but still makes strange claims. I think they can compress a 10 GHz electrical signal into a 270 GHz optical signal, with obvious ramifications in multiplexing, as you can then take 27 such signals at a time (theoretically).

      I'm no communications engineer, but I think there's a guy named Shannon who's gonna take issue with some of the claims attached to this story.

      • by 32771 (906153)

        Why should he? The system is exchanging time for bandwidth. You put a snippet of a signal 10 GHz wide into the device, and out comes a signal 1/27th the length at 270 GHz. If you have a 27-element array of devices that are properly scheduled, you can stitch the whole bunch of snippets together at the output, just not in frequency but in time. Doesn't really make a difference; you could have used WDM if you just wanted to fill the bandwidth, but there might be other reasons for doing this.
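The stitching idea in the comment above can be sketched in a few lines. The 27x factor and 2.5 ns slot come from the thread; the round-robin scheduling itself is a guess for illustration, not anything from the paper:

```python
# Illustrative sketch of the "27-element array" idea: each device
# compresses one snippet 27x in time, and the compressed snippets are
# scheduled back-to-back on the output fibre. Numbers are nominal; the
# scheduling scheme is hypothetical.

SLOT = 2.5e-9          # duration of one uncompressed snippet (2.5 ns)
FACTOR = 27            # time compression per device

def compress(snippet_bits, duration):
    """One 'time telescope': same bits, 1/27th the duration."""
    return snippet_bits, duration / FACTOR

# 27 snippets arriving in parallel, one per device
snippets = [[1, 0, 1]] * FACTOR

out_duration = 0.0
out_bits = []
for s in snippets:                  # stitch compressed outputs in time
    bits, d = compress(s, SLOT)
    out_bits.extend(bits)
    out_duration += d

# 27 compressed snippets together occupy about one original slot:
print(out_duration / SLOT)
```

The point the comment makes falls out directly: no bits are gained or lost, the same information is just packed 27x denser in time (and correspondingly wider in frequency).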

  • by Cytlid (95255) on Monday September 28, 2009 @04:27PM (#29572503)

    I can think of a myriad of uses ..|||..|.||. eady using it for that.

  • by viking80 (697716) on Monday September 28, 2009 @04:43PM (#29572699) Journal

    This is a complete oversell of a normal everyday phenomenon. This is a simple compression of a light pulse, and has been done for a long time. Dispersion usually smears out a pulse, but can just as easily compress the pulse. There is no "bending of time" here. Look up "chirped pulse amplification" and also "prism compressor", and maybe "soliton", first described in 1834 by John Scott Russell.

    • Re: (Score:3, Informative)

      by vlm (69642)

      No kidding. It's such journalist-speak I couldn't figure out what it was talking about.

      I think the journalist might have been trying to explain group velocity dispersion, aka chromatic dispersion. In a nutshell, the speed of light in a vacuum is constant, but in any material it varies a wee tiny bit by frequency, and there is no such thing as a truly monochromatic light source, although we can get pretty close. Workarounds for that problem are VERY OLD NEWS but journalists are always so gullible...


    • by kmac06 (608921) on Monday September 28, 2009 @06:03PM (#29573565)
      This is not at all an oversell (though admittedly bad journalism). It's not the same as chirped pulse amplification or prism compression.

      In this case, you start out with an essentially monochromatic long pulse, whose intensity is modulated very slowly compared to the frequency of the light, but as fast as possible using typical telecom electrical modulators. A monochromatic pulse cannot be compressed using a grating or prism. Then the wavelength of the pulse is shifted, with the amount shifted depending on the relative position in the pulse (this is the "time-domain lens"). What you have now is similar to a chirped pulse, which is compressed using a long fiber (I don't know why they don't use prism compression or something else faster here). The time-domain lensing is then undone, "de-chirping" the pulse, leaving you with a much shorter essentially monochromatic pulse at the starting wavelength, with the same amplitude modulation (i.e., carrying the same information).

      The point being a huge increase in the amount of information that can be carried in a fiber.
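The middle step described above — a linearly chirped pulse compressed by matched dispersion — can be sketched numerically. This is a hedged illustration in normalized units with made-up parameters; it models only the dispersive compression of a chirped Gaussian, not the four-wave-mixing time lenses of the actual experiment:

```python
import numpy as np

# A linearly chirped Gaussian pulse, compressed by applying the
# group-delay dispersion (GDD) that flattens its spectral phase.
N, dt = 1 << 14, 0.01
t = (np.arange(N) - N // 2) * dt
T0, C = 1.0, 10.0                      # pulse width, chirp parameter

# Chirped Gaussian: E(t) = exp(-(1 + iC) t^2 / (2 T0^2))
E_in = np.exp(-(1 + 1j * C) * t**2 / (2 * T0**2))

# Dispersion that cancels the spectral chirp: GDD = C*T0^2 / (1 + C^2)
w = 2 * np.pi * np.fft.fftfreq(N, dt)
gdd = C * T0**2 / (1 + C**2)
E_out = np.fft.ifft(np.fft.fft(E_in) * np.exp(-1j * gdd * w**2 / 2))

def rms_width(field):
    """Intensity-weighted RMS duration of a pulse envelope."""
    I = np.abs(field)**2
    return np.sqrt(np.sum(I * t**2) / np.sum(I))

# Compression ratio ~ sqrt(1 + C^2), about 10x for C = 10
print(rms_width(E_in) / rms_width(E_out))
```

The de-chirping step in the comment is the same operation run in reverse: once the pulse is compressed and its residual chirp removed, it is narrowband again at the original wavelength.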
      • Re: (Score:2, Redundant)

        As others have noted, it's likely there's a fundamental Shannon limit in there somewhere.

        It's not just possible but very likely that this scheme will work, yet you lose a proportionate amount of something else. Yes, NO FREE LUNCH. Likely candidates are that the signal-to-noise ratio will suffer by the same factor as the alleged speed increase, or the phase margins will degrade by the same amount.

        Again this kind of thing has been studied to death by Bell Labs. See their research journals-- probably 5000 pages th

        • by kmac06 (608921)
          Yes, there is of course a limit to what this can do, I didn't mean to imply otherwise. From the Nature Photonics article:

          As seen in Fig. 4, features in the original waveform as short as 40 ps are transferred to the compressed waveform, resulting in a minimal compressed feature size of 1.5 ps. This indicates that the bandwidth of the compressed waveforms is limited primarily by the electro-optic modulator bandwidth and not by the temporal telescope, as expected from the >600-GHz bandwidth pump pulses used for the FWM time lenses.

          Also, if you look at the plots they have, it's clear that the structure of the incoming beam is very well recreated. Any Nature journal is a top-notch research journal; they aren't going to publish anything that could be found in an old Bell Labs journal.

        • by 4D6963 (933028)
          It's irrelevant; all that matters is that you accommodate enough bandwidth for the final signal, naturally. The trick here is to overcome the limitations in rate of regular carrier modulation. No big theoretical problem, only the technical issue that nothing can modulate at those rates. In other words, it's a neat trick to serialise a signal into faster rates.
        • Re: (Score:3, Informative)

          by ceoyoyo (59147)

          We're so bad at modulating optical signals that we don't come anywhere near the Shannon limits of the channel. From the sound of it, the 27x increase provided (so far) by this technique also doesn't come anywhere close.

          And no, you don't get an article in Nature for regurgitating stuff from old Bell Labs journals.

      • by viking80 (697716)

        I do not think the compressed pulse can be transmitted any length in any fiber. Since the different parts of the pulse have different wavelengths, dispersion would destroy the signal fast, except maybe in vacuum. Traffic at different wavelengths has many nonlinear interactions that are amplified if the signals all travel at one speed. Zero-dispersion fibers (which are zero-dispersion only at one frequency anyway) have been abandoned for dispersive fibers to accommodate multicolor traffic.

        • by kmac06 (608921)
          You're half right. The chirped pulse, which is broadband, would undergo dispersion in a fiber (it is in fact this dispersion which initially compresses the beam). After this compression, the beam is de-chirped, so it is again narrowband.
      • by 4D6963 (933028)

        Wow, I actually got it! I feel privileged. I think an algorithmic equivalent could be done by taking the slow modulated carrier's analytic signal, modulating it with a complex chirp, convolving it with another chirp as to make the upper end of the chirp get closer in time to the lower end, modulating it again with a complex chirp that would flatten it and there you go. Correct?

      • by jvkjvk (102057)

        This is not at all an oversell (though admittedly bad journalism).

        "A time lens is essentially like an optical lens," says Foster. An optical lens can deflect a light beam into a much smaller area of space; a time lens deflects..."

        Yes, it is an oversell. They are not deflecting time at all or compressing time at all.

        I don't care that it is not chirped pulse amplification or whatever. The fact is that the team itself is describing this as a 'time lens', and that is blatantly false.

        Regards.

  • by Alzheimers (467217) on Monday September 28, 2009 @05:17PM (#29573105)

    What happens when you take four Time Lenses and align them at 90-degree angles to each other?

    ONE MAN KNOWS THE TRUTH! [timecube.com]

    • .....what....the....fuck.....?

      .....I'm baffled......who let him near a computer, on the internet, or even out of a womb?

  • Where's my goddamn flying car???

  • IANAP, but based on reading the temporal magnifier article [cornell.edu], the conclusion is that this is cool stuff that is intentionally being massively distorted by New Scientist in a UFO-craze way. Totally turns me off that mag now.

    The paper is relatively readable even if you are not a physicist. Basically there is nothing spooky going on as New Scientist would say. The researchers developed a way to greatly stretch a short signal to a long one, is all.

    The problem as stated in the Nature abstract, is that the science of pho

  • This is an interesting idea, but they note that you will need to decompress the stream at the other end. This means that unless you can multiplex the light and have multiple compressed streams sharing the same channel, you won't see any performance improvement. You are still limited to transmitting/receiving at a fixed rate; it's just that the bits take a shorter time to transit the pipe.

    Are there any losses that are proportional to the time a light pulse spends in a fiber? I'm pretty sure it's just relat
  • So they've discovered WinZip for laser pulses? That's what this sounds like to me... 2.5 nanosecond / 24 bit laser pulse goes in, 92 picosecond / 24 bit laser pulse comes out, with the same information encoded. It's lossless compression, basically.
