New Laser Data Transfer Rate Record Set At 26 Tbps 127
MasterPatricko writes "Scientists at the Karlsruhe Institute of Technology (KIT), Germany have published a technique to push optical data transfer rates to new levels. The article says, 'The trick is to use what is known as a "fast Fourier transform" to unpick more than 300 separate colours of light in a laser beam, each encoded with its own string of information.'"
I was going to make a "Library of Congress" joke... (Score:2, Informative)
At those speeds, the entire Library of Congress collections could be sent down an optical fibre in 10 seconds.
Well played, BBC. Well played, indeed...
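The BBC's figure at least passes a back-of-envelope check. A quick sanity calculation (the "few tens of terabytes" size for the Library of Congress's digitized collections is a commonly quoted rough estimate, not an exact number):

```python
# Sanity check: how much data does 26 Tbps move in 10 seconds?
rate_tbps = 26            # terabits per second (the reported record)
seconds = 10              # the BBC's claimed transfer time

terabits = rate_tbps * seconds
terabytes = terabits / 8  # 8 bits per byte

print(f"{terabits} Tb = {terabytes:.1f} TB in {seconds} s")
# 32.5 TB is indeed in the same ballpark as common LoC size estimates
```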
Re:Makes Sense... (Score:3, Informative)
So that one episode in Voyager where Seven of Nine makes a "Fourier Analysis" wasn't total bullshit?
No, we've been doing Fourier analysis for a very long time. Fourier himself introduced it in the early 19th century.
Regarding the technique, it sounds like an optical-computing implementation of OFDM (orthogonal frequency division multiplexing) [wikipedia.org], which is a core technology for ADSL and many other communications protocols. Electronic and radio OFDM is limited by its high crest factor (the ratio of peak level to RMS level, also called the peak-to-average power ratio); doing the transform optically may sidestep this problem.
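For the curious, here's a toy pure-Python sketch (nothing from the paper, just the textbook math) of why the crest factor bites OFDM: the time-domain signal is the inverse DFT of many independently modulated subcarriers, and when those subcarriers occasionally add in phase the peak level lands well above the RMS level:

```python
import cmath
import math
import random

def idft(X):
    """Inverse DFT, naive O(N^2) version; fine for a tiny demo."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

random.seed(0)
N = 64  # number of subcarriers

# One random QPSK symbol per subcarrier
qpsk = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]
symbols = [random.choice(qpsk) for _ in range(N)]

# OFDM: the transmitted time-domain signal is the IDFT of the symbols
signal = idft(symbols)

peak = max(abs(s) for s in signal)
rms = math.sqrt(sum(abs(s) ** 2 for s in signal) / N)
papr_db = 20 * math.log10(peak / rms)
print(f"peak/RMS = {peak / rms:.2f} ({papr_db:.1f} dB)")
```

Run it a few times with different seeds and the peak-to-RMS ratio stays well above 1, which is exactly what stresses amplifiers in electronic OFDM systems.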
Re:Ok... let me try and translate (Score:5, Informative)
I'm afraid you're mixing up frequency/wavelength modes with propagation modes. Most long-distance systems use several different wavelengths, that's what WDM is. But they use single-mode fibers, meaning that light at a given optical frequency (and polarization) can only propagate in a single way, thus at a given speed. Multi-mode fibers, with a wider core, let light propagate over different modes (different possible paths in the core for light rays, kind of), which plays havoc with the signal (pulses get echoes and whatnot), which is why they are used only for short distances.
The experiment described here uses OFDM, which in principle is akin to WDM but squeezes many wavelengths as close together as theoretically possible, too close to be separated by classical optical filters. Instead, you can separate them mathematically using an FFT, but that takes a lot of computing power. What the authors did was implement the FFT optically, which is very neat. It enables the use of OFDM at ultrahigh bit rates. And the details of OFDM are such that, used in the right way, it can be extremely resistant to signal degradation (look e.g. at Figure 4(c) in the Nature Photonics article, and think about how tightly a conventional system at that bit rate would have to manage dispersion).
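A toy software sketch of the underlying math (the paper's novelty is doing this transform passively in optics; below it's just a naive DFT in Python): subcarriers generated by an inverse DFT overlap spectrally, yet the forward DFT separates them exactly, because they are orthogonal over one symbol period:

```python
import cmath
import math

def dft(x):
    """Forward DFT, naive O(N^2) version."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT with 1/N normalization, so dft(idft(X)) == X."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

# Arbitrary data symbols on 8 subcarriers. Their spectra overlap,
# yet the DFT at the receiver separates them with no crosstalk.
tx = [1 + 0j, -1 + 0j, 1j, -1j, 1 + 1j, -1 - 1j, 0.5 + 0j, 0.5j]
rx = dft(idft(tx))

for a, b in zip(tx, rx):
    assert abs(a - b) < 1e-9
print("all 8 subcarriers recovered exactly")
```

Doing this same transform electronically at hundreds of subcarriers and terabit rates is what blows the computing budget; the optical implementation gets it essentially for free.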
What bugs me is that they describe their setup as performing better than plain coherent detection (Figure 5), which I have a hard time believing. Exactly how did they do the comparison, I wonder.