

New Material Sets Stage For All-Optical Computing
An anonymous reader writes with this excerpt from the International Business Times: "Researchers have made a new material that can be used to guide waves of light, a breakthrough that could lead to ultra-fast computing. Georgia Tech scientists are using specially designed organic dyes that can process and redirect light without needing to convert it to electricity first. ... 'For this class of molecules, we can with a high degree of reliability predict where the molecules will have both large optical nonlinearities and low two-photon absorption,' said [Georgia Tech School of Chemistry professor Seth] Marder."
According to the article, using an optical router could lead to transmission speeds as high as 2,000 gigabits per second, five times faster than current technology.
Re: (Score:1, Offtopic)
You mean "Bazinga!"
Didn't see that coming (Score:2)
"2000 gigabits per second"
GigaBITs? Wow!
Re: (Score:2, Insightful)
Re: (Score:2)
so you're saying not only are they bits, but they're power-of-ten giga-?
Re: (Score:1)
Re: (Score:2, Insightful)
Probably has more to do with the fact that historically some hardware had byte and word sizes that weren't multiples of 8.
E.g. see http://en.wikipedia.org/wiki/36-bit [wikipedia.org]
Re: (Score:1)
of course (Score:1)
Re:OS abstraction wtf (Score:1, Informative)
Where did you learn it was an OS abstraction? That's just ... sigh.
Bytes are the smallest addressable unit of memory the CPU can handle. It doesn't matter if the memory controller only does cache line fills or whatever, memory addresses are in units of bytes.
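To make that concrete, here's a minimal C sketch (the 4-byte step is just the common case for uint32_t, not a guarantee on every platform): pointer arithmetic on a char* counts individual bytes, which is exactly the "smallest addressable unit" described above.

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t words[2] = {0, 0};
    /* char* pointers step through memory one byte at a time,
       regardless of how the memory controller moves cache lines. */
    char *p = (char *)&words[0];
    char *q = (char *)&words[1];
    printf("adjacent 32-bit words are %td byte-addresses apart\n", q - p); /* 4 */
    return 0;
}
```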
Re: (Score:2)
Just a guess, but I wouldn't imagine packets are measured in bits but in bytes. That's why hex is often used, right? Unless the router is using an exotic protocol, everything will be measured in bytes. Data structures are almost all in multiples of bytes; sizeof(bool) returns 1. malloc works with bytes, not bits. I'm pretty sure you will have a hard time finding anything that works directly in bits and not bytes, routers included.
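A minimal C illustration of that byte orientation (sizeof(_Bool) is 1 on typical platforms, though the standard only guarantees at least 1):

```c
#include <stdio.h>
#include <stdlib.h>
#include <stdbool.h>

int main(void) {
    /* sizeof reports sizes in bytes, never bits. */
    printf("sizeof(bool) = %zu\n", sizeof(bool)); /* 1 on typical platforms */
    printf("sizeof(int)  = %zu\n", sizeof(int));  /* commonly 4 */

    /* malloc takes a byte count; there is no bit-granular allocator. */
    unsigned char *buf = malloc(64); /* 64 bytes, i.e. 512 bits */
    if (buf != NULL) {
        free(buf);
    }
    return 0;
}
```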
Re: (Score:1)
2000 / 1000 = 5
Not to mention that if 1000Gb/s connections are widely commercially available today, I have to assume that faster connections are available for specialized purposes.
Re: (Score:1)
Re: (Score:3, Funny)
Gah, I hate these lame random units... gigabits/second.
Could somebody translate that to a more standard Libraries of Congress/fortnight, please?
Re: (Score:2)
I'd say... approximately 27503 LoC/Fortnight
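For anyone checking the math: the parent's 27503 implies roughly 11 TB per Library of Congress, while the more commonly quoted estimate is about 10 TB, which gives roughly 30,240. A back-of-the-envelope sketch, with the LoC size as the big assumption:

```c
#include <stdio.h>

int main(void) {
    /* All figures are back-of-the-envelope assumptions. */
    double gigabits_per_sec = 2000.0;
    double bytes_per_sec = gigabits_per_sec * 1e9 / 8.0;   /* 250 GB/s */
    double fortnight_sec = 14.0 * 24.0 * 3600.0;           /* 1,209,600 s */
    double loc_bytes = 10e12;  /* assume 1 Library of Congress ~ 10 TB */

    double loc_per_fortnight = bytes_per_sec * fortnight_sec / loc_bytes;
    printf("%.0f LoC/fortnight\n", loc_per_fortnight);     /* ~30240 */
    return 0;
}
```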
Re: (Score:2)
Re: (Score:2)
Re: (Score:2, Insightful)
Exactly, like lower latency. The conversion into an electrical signal and then back to optical probably adds a bit of latency. I'm no expert, but I'd imagine that data going to and from a typical destination on the internet goes through several of these conversions, adding (in most cases negligible) latency. If most of the routers on the net were all-optical, I'd imagine we'd have an internet with imperceptible latency most of the time. That could lead to things as simple as lag-free gaming, real-time vi...
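A toy model of that hunch, with made-up numbers (the per-hop conversion cost below is an illustrative placeholder, not a measured figure):

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical, illustrative figures only. */
    int hops = 15;               /* routers on a typical internet path */
    double oeo_delay_us = 10.0;  /* assumed per-hop O-E-O conversion cost */
    double total_us = hops * oeo_delay_us;
    printf("conversions alone add ~%.0f us over %d hops\n", total_us, hops);
    return 0;
}
```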
Re: (Score:1, Insightful)
Re:But (Score:5, Insightful)
Of course. Because the new technology is also getting better, and usually at a much quicker rate than the existing one, since the existing one is already near its limits.
Often a new technology is still worse than the old one because of its experimental state, but it is worth pursuing anyway because of its huge potential.
The same is true for optical circuits.
Re: (Score:1)
If the best expected performance of the new technology is just 5 times better than current technology, is it really worth pursuing?
Moore's law says 2× every two years, but some people think we're running up against the limits of silicon.
This gives us 5× in one shot.
You ask is it worth it? Are you suggesting that we not engage in pure science any more because it might not pan out in the long run? Only time will tell if it was worth it.
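A quick sanity check of that comparison, assuming the usual "doubling every two years" reading of Moore's law:

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    /* A 5x jump expressed in Moore's-law doublings:
       log2(5) doublings at ~2 years per doubling. */
    double doublings = log2(5.0);    /* ~2.32 */
    double years = doublings * 2.0;  /* ~4.6 years */
    printf("5x ~ %.2f doublings ~ %.1f years of scaling\n", doublings, years);
    return 0;
}
```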
Sure (Score:1)
Re:But (Score:5, Insightful)
The first automobiles could easily be outrun by a horse. I guess we're fortunate that no one noticed that or else they would've all agreed that automobile technology was a waste of time and should be abandoned.
Re: (Score:1)
Faster FAster FASTER! (Score:4, Funny)
Have you seen the pic? (Score:2, Funny)
I couldn't take my eyes off that advertise-y pic [ibtimes.com].
Must...read...article.
Re: (Score:2)
http://www.phdcomics.com/comics/archive.php?comicid=1187 [phdcomics.com]
Click on the next ones to see the whole story
Similar stuff from IBM (Score:3, Interesting)
...And the same news from Semiconductor Intl [semiconductor.net].
Re: (Score:2)
Simple. IBM no longer needed help because it invented awesome.
Optocouplers (Score:4, Interesting)
Re:Optocouplers (Score:4, Informative)
optoelectronics
If they don’t have to be converted to electricity first, then where are the electronics in this?
A better name is “photonics”. :)
Re: (Score:2)
Given that the signals arrive in optical form, you (will) have two choices:
1) Convert them into electrical signals, using optoelectronics, process the data, and then (sometimes) convert the signal back to optical.
2) Keep the signals in optical form and process them using these new materials.
Just because the first option has more steps doesn't mean it's slower. If you have very fast converters and then very fast transistors (like the graphene-based ones linked above), then y...
Re: (Score:2)
optoelectronics
If they don’t have to be converted to electricity first, then where are the electronics in this?
A better name is “photonics”. :)
Well, that holds until your computer processes everything in light, including the CPU, motherboard, display card, and network interface. I imagine the future AC transformer being a giant LED light bulb, with every component inside acting as a light path... to the extent that light is processed and redirected straight to the monitor panel as a lit pixel. That would be very awesome, yet I don't see it coming very soon :P. So we'll still be stuck with electronics somewhere for a while.
Re: (Score:1)
Even better, make the optoelectronics out of graphene too.
I'd love a quick turnaround on technologies such as this, even if just for the novelty factor at first.
Even something as powerful as a 486 would suffice, although the MHz would probably not be an issue.
Go Jackets! (Score:1)
Is it just me? (Score:2)
I've been reading headlines for the past 20 years that claim "breakthroughs" in all-photonic computing. Where are the all-photonic routers?
Re: (Score:1)
Power & Heat (Score:3, Interesting)
This is a result of the highly-clustered, highly-mobile computing age we live in today. A single fast chip isn't as applicable any more. Give us tiny and low-power.
The lights SHOULD dim when you switch it on! (Score:1)
Probably not much to see here, at least yet (Score:3, Interesting)
Heat and tempest security (Score:2)
Obviously the ramifications for emissions security (TEMPEST, though that's a simplification) are huge, but what is this likely to do for heat and component size? I can see this being a great opportunity for a lot of military applications, even if the speed is only a few times better than what we have now.
Fiber (Score:1)
Cloaking Device? (Score:1)
Unfortunately, the article didn't hint at this possibility at all. However, I did pick this up:
The research was funded by the National Science Foundation (NSF), the Defense Advanced Research Projects Agency (DARPA) and the Office of Naval Research (ONR).
So DARPA's helping fund it, eh? In answer to my own question then, "Yes!"
Leave it to DARPA to fund the development of a cloaking device and play it off as a computer breakthrough. I, for one, am stoked.
Smoke and mirrors. (Score:1)