Integrated Reflector Could Lead to Ubiquitous LEDs 86

Andreas writes "Professor Schubert says he has found a way to raise the efficiency of LEDs to 99%. From an article on Advanced Technology: 'Until now, all lighting systems, especially incandescent bulbs, generated more heat than light. But our 99-percent efficient reflectors for LEDs make them the first candidate for light-bulb replacement that generates more light than heat,' said Schubert."
  • by rco3 ( 198978 ) on Sunday July 25, 2004 @10:14AM (#9794192) Homepage
    "Professor Schubert says he has found a way to raise the efficiency of LEDs to 99%.
    No, that's not what he says. The reflector is 99% efficient, not the LED. This reflector just means that some of the light emitted by the LED, which otherwise might be absorbed by the LED substrate or other structure and converted to waste heat, is now being reflected back out as usable light.
    This new technology does nothing to improve the quantum efficiency of the LED itself. It's an important and useful technology, sure, but it is NOT a 99% efficient LED.
    • As the man says, a 99% reflector is not a 99% LED but this technology would certainly seem to be a useful advance.

      Some current LEDs already use thin film techniques and reflectors to collect the rear-emitted light and throw it out the front, but this is the first I have heard of combining the reflector with the wiring. This might also have the potential of moving heat out of the junction more efficiently, which would be a real help in a lot of applications.

      Still I'm curious about how much this will ad
      • by Engineer-Poet ( 795260 ) on Sunday July 25, 2004 @11:04AM (#9794407) Homepage Journal
        Quoth the poster: [slashdot.org]
        There is no point making a bright LED if the total system cost is still ten to twenty times that of an incandescent source.
        As a counterexample, consider the compact-fluorescent bulb. Most cost around ten times as much as an incandescent of equivalent brightness, but the savings in replacement costs and power will pay back the difference in as little as a few months.

        LED technology has the advantage of longer life than fluorescent. With the increase in efficiency from reflectors, they could cut power costs below fluorescent and become the TCO winners.

        • by Elledan ( 582730 ) on Sunday July 25, 2004 @11:27AM (#9794484) Homepage
          "LED technology has the advantage of longer life than fluorescent. With the increase in efficiency from reflectors, they could cut power costs below fluorescent and become the TCO winners."

          Also something to keep in mind is that LEDs are far more robust than fluorescent and incandescent lights. Incandescent lights can't take shocks very well, and the huge temperature delta between the on and off states reduces their lifespan significantly. Fluorescent lights are also relatively fragile (ever tried replacing fifty or so of those tubes without shattering at least one of them?), and the ballasts used to generate the required high voltage (most types of ballast, anyway) create quite a lot of EMI, which is bad for sensitive equipment and cables like Cat-5.

          LEDs generate very little heat, require only a very small current (tens of mA!) at equally low voltages, produce no EMI, are unaffected by all but the most severe shocks and last virtually forever (100,000 hours for red, green, etc. with ease).

          If LEDs are made brighter, even if this makes them more expensive than other technologies, there are always some (less friendly) places where they would work very well and would be cheaper in the long term.
          • colour temperature (Score:3, Interesting)

            by RMH101 ( 636144 )
            but bear in mind the colour temperature of them seems "weird" to the human eye; lighting a room in them isn't very cozy.
            • Fluorescent lights also have a "weird" color temperature - but improvements have been made and we got used to them.
            • Can't be any worse than fluorescent lamps. I can't stand the things myself and hate having to be under them 8 hours/day.

              Incandescent bulbs may ultimately be yellow in color, but they *feel* better. In any case, I bet there would eventually be LEDs capable of emitting full-spectrum light, not just white...
          • "tried replacing fifty or so of those tubes without shattering at least one of them"
            Yes. Yesterday in fact.
            Course it helps that I'm tall enough to reach the fittings without a ladder...
        • I'd probably add the fact that fluorescent bulbs are environmentally problematic, and can be dangerous to health. While manufacturing LEDs undoubtedly creates waste by-products, I would doubt that they would be much worse than those from manufacturing fluorescent bulbs.
        • Ahh, but a compact fluorescent is still more efficient than an LED.
          • Fluorescents are perhaps 30% efficient. If reflectors can nearly double the light output of LEDs by using light that would otherwise have been absorbed by the back, the LEDs could beat the efficiency of fluorescents.

            This is not as difficult as it sounds. Fluorescent lights have two conversion stages (electricity to UV in the mercury vapor, UV to visible light in the phosphors) and each conversion has losses. LEDs have only one, which gives the LED an inherent advantage.

            • No, white LEDs have two stages: electricity to blue/UV in the chip, UV/blue to white in the phosphors. Except that the mercury-argon arc is so far more efficient than the LED chip at producing blue/UV light. In both cases, mind you, the base device does produce a certain amount of visible light as well.

              LEDs are decent for some single-color applications (although cold-cathode "neon sign" tubing often does just as good of a job), low-voltage, and under some mechanical-shock requirements.

              Add to this that L
  • Several years ago I researched this on the web, but never really reached a satisfactory resolution.

    There's a type of traffic signal whose illumination is clearly visible only within a narrow angle. As you approach the intersection, you can see all three *lamps/lenses*, but you don't see which lamp (r/y/g) is currently *lit* until you enter within a certain angle of the lamp.

    Can anyone *authoritatively* explain how this is done in *this* application?
  • 110/230V AC (Score:5, Interesting)

    by EnglishTim ( 9662 ) on Sunday July 25, 2004 @11:34AM (#9794512)
    Okay, so let's assume all our lightbulbs start being made from LEDs... At some point soon we're going to have to start changing our lighting circuits to 5V, or something like that. It's madness that each lightbulb will have to contain its own little transformer - it'll make the bulbs vastly more expensive and wasteful.

    There are a selection of appliances that work well with 110/230V AC - things that require a lot of power like kettles, hoovers, heaters, washing machines, hobs, tumble driers and the like. However, there's an increasing number of appliances in a modern household that would be much better served by a 12V DC supply.

    How long do you think it'll be before we start changing over?
    • Re:110/230V AC (Score:5, Informative)

      by Smidge204 ( 605297 ) on Sunday July 25, 2004 @12:25PM (#9794803) Journal
      Unfortunately DC power doesn't transmit over any kind of distance very well. AC is much more efficient for that. (Esp. at high voltages... 20,000V+)

      Also, all fluorescent lights have transformers in them, so suddenly it's not too unreasonable for each light fixture to have its own little transformer in it!
      =Smidge=
      • Re:110/230V AC (Score:4, Informative)

        by falzer ( 224563 ) on Sunday July 25, 2004 @02:18PM (#9795427)
        > Unfortunately DC power doesn't transmit over any kind of distance very well.

        Looking at just the wire itself, transmission losses aren't worse for DC. There are a few HVDC transmission lines in operation now. Some are used for 50/60Hz conversion.

        The reason AC is used is that it's easier/cheaper to efficiently step the voltage up (and down) to useful levels, as per your power transmission example.
        • Re:110/230V AC (Score:2, Informative)

          by HaveNoMouth ( 556104 )
          The reason AC is used is that it's easier/cheaper to efficiently step the voltage up (and down) to useful levels, as per your power transmission example.

          Exactly right. The real issue is that transmission lines are not perfect conductors; there is always some small resistance which causes power to be lost in heating up the wire. The heat loss is given by the square of the current times that resistance. Therefore, you want to transmit power with as low a current as possible to minimize the power lost along t
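          The I²R argument above can be put in numbers. This is a minimal sketch with illustrative values (the 10 kW load and 1 Ω line resistance are assumptions, not figures from the thread):

          ```python
          # Transmission heat loss P_loss = I^2 * R for a fixed delivered power.
          # Illustrative numbers: 10 kW delivered over a line with 1 ohm resistance.

          def line_loss(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
              """Heat dissipated in the line: I = P / V, loss = I^2 * R."""
              current = power_w / voltage_v
              return current ** 2 * resistance_ohm

          # Same power, same wire -- only the transmission voltage changes.
          print(line_loss(10_000, 120, 1.0))     # ~6944 W lost at 120 V
          print(line_loss(10_000, 12_000, 1.0))  # ~0.69 W lost at 12 kV
          ```

          Raising the voltage 100x cuts the loss 10,000x, which is exactly why transmission runs at tens of kilovolts.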

      • by Anonymous Coward
        That's why it's used for the highest-power and longest-distance links, such as the lines from northern Quebec to New York.

        AC lines radiate like antennas. 1/4 wavelength at 60 Hz is 776 miles at c (in practice, reduce by the appropriate velocity factor), so if a transmission line gets long enough, it can radiate a lot of power.

        What AC is good for is *conversion*. For long distances, you want high voltages, and transformers are a simple, reliable, and cheap way to convert from a high transmission voltage
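        The quarter-wavelength figure in the parent is easy to check from the speed of light:

        ```python
        # Quarter wavelength of a 60 Hz wave propagating at c,
        # checking the 776-mile figure quoted above.

        C = 299_792_458.0          # speed of light, m/s
        wavelength_m = C / 60.0    # full wavelength at 60 Hz
        quarter_mi = wavelength_m / 4 / 1609.344  # meters per mile

        print(round(quarter_mi))   # 776
        ```

        As the parent notes, real lines are shorter electrically (velocity factor below 1), so the physical distance is somewhat less.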
      • If that's the case, then it seems to make sense if everyone just has a central DC transformer in their house. These days, everyone probably owns about 10-20 DC transformers already (every time you buy a phone, answering machine, recharger, computer speakers, or just about any small piece of digital electronics you get one). It's kind of wasteful that we have so many of these things kicking around all the time; why not just have a central one for the entire household?
    • Re:110/230V AC (Score:3, Informative)

      by alienw ( 585907 )
      Running 5V around the whole house would be much more inefficient than putting a transformer inside each bulb. It would also be expensive, and a fire hazard. Think about it: running 1kW of power through a 5V system would require 200 amps of current! That would require welding cable-sized wires.
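      The 200 A figure follows directly from I = P / V (the 1 kW load is the parent's own example):

      ```python
      # Current required to deliver a given power at a given supply voltage.

      def required_current(power_w: float, volts: float) -> float:
          return power_w / volts

      print(required_current(1000, 5))    # 200.0 A -- welding-cable territory
      print(required_current(1000, 120))  # ~8.3 A -- ordinary house wiring
      ```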

      • But welding cable-sized wires are fun! I've run (theatrical) lighting systems off 4/0 cabling - an accidental short is pretty impressive, tripping the breaker for the building mains. I've learned the hard way not to trust my school's electricians when there is more than 120V/20A involved.
    • Re:110/230V AC (Score:4, Informative)

      by bluGill ( 862 ) on Sunday July 25, 2004 @01:01PM (#9795018)

      Electronics 101: when you connect 2 circuits in series each sees half the total voltage. Connect 24 LEDs in series to a 120 volt line and each sees 5 volts! By definition there is no need to transform the AC into DC - the D in LED stands for diode, which is what you use to turn AC into DC! In the real world you are likely to use 48 LEDs, in two different strings, so that you get light from both sides of the wave.
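      A quick sketch of the parent's series-string arithmetic. The 5 V per LED figure is the post's own; real forward voltages are lower and vary by color, and as the replies point out you still need a current-limiting element (the helper name here is just for illustration):

      ```python
      # How many LEDs in series for their combined forward drop to reach
      # the line voltage. Nominal forward voltage per LED is assumed;
      # real devices vary, so current limiting is still required.

      def leds_for_line(line_v: float, vf_per_led: float) -> int:
          """Number of series LEDs whose combined drop roughly equals line_v."""
          return round(line_v / vf_per_led)

      print(leds_for_line(120, 5.0))  # 24, the parent's figure
      print(leds_for_line(120, 1.7))  # ~71 red LEDs at a typical 1.7 V drop
      ```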

      • Can you say "epileptic seizure"?
        • Can you say "epileptic seizure"?
          Can you say 'persistence of vision'? AFAIK, epileptics do not have a problem with cinemas (24fps/Hz) so why should there be a problem with the 25/30/50/60Hz cycle this setup would give?

          • Movie projectors and TVs cycle at about 30 hertz, but the light output from them is relatively constant even though the image is changing. On the other hand, haven't you ever seen the flicker of a fluorescent bulb? How about your computer monitor? These kinds of flickers are both on the order of 60 hertz, yet you can see them.

            An LED would flicker a thousand times more noticeably because LEDs have a response-time on the order of 0.5 milliseconds. Additionally, an LED in a pure AC circuit couldn't help b
            • If you use a full-wave rectifier (or just use two strings of LEDs), both the positive and negative sides of the cycle will cause the LEDs to light. So you'll actually get them strobing at 120Hz. The eye can't see changes past 100Hz, so no problems there.

              Grab.
      • Electronics 101: when you connect 2 circuits in series each sees half the total voltage.

        Electronics 102: when you connect 2 equal non-reactive loads in series, each drops half the total voltage.

        In the real world you are likely to use 48 LEDs, in two different strings, so that you get light from both sides of the wave.

        In real world, you use a full-wave rectifier so that all 48 LEDs light during both sides of the wave.
        • Even better you throw in a capacitor with the rectifier to smooth the power. Reduces the flicker when you move your eyes and improves LED longevity by lowering the peak power.

          -
          • You can get rid of all those problems easily.
            1. Rectify, filter and chop the incoming AC.
            2. Feed chopped AC into a small transformer which changes it from the filter output voltage to the voltage required by the diode string.
            3. Run this at ~20 KHz so that flicker is invisible.

            If you run the switcher to get a specified level of current through the LED string, you can both vary the brightness to spec and run at any voltage within the capability of the switcher. I don't see a big market for internationalized light

      • In the real world you are likely to use 48 LEDs, in two different strings, so that you get light from both sides of the wave.
        Me thinks using half of them with a rectifier would be more efficient - same amount of light with a bit more than half of the components. (Although the other may actually turn out cheaper, who knows...)
      • the D in LED stands for diode

        Unfortunately, they're not very good diodes, with the reverse (blocking) voltage typically not being much more than the forward voltage. What this means in practical terms is that unless you also have a real diode in series with the LEDs (or a varistor/series-pair of zeners across the LEDs), the first power surge along will kill them.

        Your 120VAC (our 240VAC) is not constant. It goes from zero to root-2 of the voltage. This causes immense problems with movement, particularly m

      • > Connect 24 LEDs in series to a 120 volt line and each sees 5 volts!

        Yes, but only a retard, or a marketing droid, would set them up that way. One goes out - they all go out. Set them up in parallel and you get to see exactly which one died.

        Of course if it's a unit of 24 LEDs that's only replaceable as a unit then that might make sense. Still, you're binning 23 good LEDs which seems a bit of a waste, and 23 LEDs will give nearly as good light as 24 and may even, depending on the application, still b
        • The usual drop on a red LED is about 1.7 volts, so a string of 24 would have a working voltage of about 41 volts (yellow and green have higher forward voltages IIRC). You are going to need a power supply to run this, and it is going to be much easier to generate 41 volts @ 200 mA for a single series string than 6.8 volts @ 1.2 A to run 6 parallel strings of 4. For the hypothetical traffic light the "supply" could be as simple as a diode bridge and a current-limiting resistor; that resistor would run mighty
    • I have had this thought as well. For quite a while I have considered a 12V wiring system, and at this point I have decided that my next house will have one. Naysayers will tout the dangers of DC, as well as the inefficiency. But what they don't realize is that a well designed system doesn't have to pump a gazillion amps over every line. Any individual outlet doesn't need to provide more than a few amps, since 'normal' 12V applications don't pull much power (cell phone charger, LED light, etc). A well tho
      • While 12V is a popular choice for low-voltage inputs, it's by no means the only one. Your cell phone (or whatever gadgets you want to power) is just as likely to require a 6V or 9V input. How are you going to handle that?
      • Any nay-sayers tout the dangers of DC, they need to consider the stunningly dangerous, should-never-be-allowed hazard of the battery in their car... :-)

        Grab.
      • DC doesn't lose more than AC; in fact just the opposite. However, 12V does of course lose more than 110V. 12V is probably not enough for a useful circuit through the house; anything more than say 20A breakers would be too dangerous and that only gives you 240W. 24V or 48V would probably be better -- you could use cheap and efficient DC-DC "transformers" to get 4/6/9/12V according to what each appliance needs.
    • Re:110/230V AC (Score:4, Informative)

      by SuperBanana ( 662181 ) on Sunday July 25, 2004 @02:00PM (#9795313)
      It's madness that each lightbulb will have to contain its own little transformer - it'll make the bulbs vastly more expensive and wasteful.

      If you have 2V LEDs, you only need to wire about 60 of them in series and you've taken care of the voltage problem. Well, except they'll blink at 60 Hz, quite strongly... and if one failed, they'd all go out. But in any case, it's hardly rocket science to make use of the higher voltage level, especially since LEDs will tolerate AC. Incidentally, look at a screw-in fluorescent bulb some time - they've gotten the whole thing down to $10 or so, and that includes a transformer and electronics to raise the voltage. Transformers etc. are very cheap.

      However, there's an increasing number of appliances in a modern household that would be much better served by a 12V DC supply. How long do you think it'll be before we start changing over?

      Never. The whole point behind A/C is that it is very easy to step up/down, and as a result, you can use a higher voltage for transmission and distribution. Higher voltages mean less current flow for the same amount of energy, which means reasonably sized wiring and such.

      Even in the short distances involved in a house, losses from wiring can be substantial at voltages as low as 12V. 48V might be a better choice, but I can't see it ever taking off.

    • Re:110/230V AC (Score:3, Interesting)

      by Peepsalot ( 654517 )

      It's madness that each lightbulb will have to contain its own little transformer - it'll make the bulbs vastly more expensive and wasteful.

      Sorry, I don't think you'll see a change (at least not to a lower voltage) in your wall outlet's voltage any time soon. Maybe a transformer in every light bulb seems wasteful, but take into consideration that LEDs can last roughly 100,000 hrs as opposed to 1,000 for incandescents.

      Still seems wasteful?
      Transformers don't have to be huge, especially if they are power

    • Re:110/230V AC (Score:5, Interesting)

      by rco3 ( 198978 ) on Sunday July 25, 2004 @03:09PM (#9795704) Homepage
      A few points to note, realizing that not many posters around here are EEs:

      1) LEDs are not voltage-mode devices. There IS a typical voltage drop associated with an LED, but it can vary appreciably between devices. One sets the operational point of an LED by controlling the current through it and allowing the voltage to settle to whatever value it wants. Typically, one would want to see around 30 mA through a normal T1-3/4 LED. Depending on the chemistry of the LED, this could result in a voltage anywhere between 1.5V and 3 or 4V. This, as I said, will vary somewhat between different LEDs of the identical type. If you try to set the voltage, you'll get wildly varying currents and a lot of dead LEDs.

      2) Stringing together LEDs in series to get something approaching 120V drop is a good idea, but you still have to limit the current. Leaving a few volts between the nominal operational voltage of your LED string and the nominal supply voltage is a good idea, because you can then use an active (or passive) current limitation scheme which operates within that voltage gap. The simplest way is with a single resistor, sized such that R= (Vsupply-VLEDS)/ILEDS. This is subject to variation due to device mismatch, temp variation, etc, and dissipates some power in the resistor. Another way to do this, which allows for the LED to be operated from a much higher voltage than it's rated for, is to use a series capacitor. The determination of proper capacitor size is a bit more tricky, but you can successfully run a single LED from a 120V supply. The indicator in my waterbed heater has run this way for a couple of years now. Nice part is that the capacitor does NOT dissipate any power as heat. Enough of them might screw up your power factor enough to piss off the power company, though :-)

      Important messages to take home from this: you can't set LED operating point from the voltage across it, at least not safely and reliably; you can operate LEDs from 120VAC using a capacitor as the gain setting element, which is appreciably cheaper than using a transformer.
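      The resistor formula from point 2, plus a rough sizing of the series capacitor using its reactance as the limiting impedance, can be sketched as follows. The component values are illustrative assumptions, and the capacitor calculation deliberately ignores the LED drop and phase angle, which the parent's more careful design would account for:

      ```python
      import math

      # Current limiting for an LED string on a 120 VAC line.
      # Resistor method (from the parent): R = (Vsupply - Vleds) / Iled.
      # Capacitor method (rough sketch): treat the capacitor's reactance
      # X_C = 1 / (2*pi*f*C) as the current-limiting impedance.

      def limit_resistor(v_supply: float, v_leds: float, i_led: float) -> float:
          """Series resistor that drops the excess voltage at the LED current."""
          return (v_supply - v_leds) / i_led

      def capacitor_for_current(v_rms: float, i_rms: float, freq_hz: float) -> float:
          """Capacitance whose reactance passes roughly i_rms at v_rms (farads)."""
          x_c = v_rms / i_rms  # required reactance, ohms
          return 1.0 / (2 * math.pi * freq_hz * x_c)

      # 110 V LED string at 20 mA from a 120 V supply:
      print(limit_resistor(120, 110, 0.02))              # 500 ohms
      # Single-LED-scale current straight off 120 VAC, 60 Hz:
      print(capacitor_for_current(120, 0.02, 60) * 1e6)  # ~0.44 uF
      ```

      Note the key difference the parent describes: the resistor dissipates its drop as heat, while the (ideal) capacitor dissipates nothing.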
      • That's really interesting. Can you elaborate on how the series capacitor works because my school physics education is becoming hazy now?

        Is there any problem using that also on 230V A/C?

        What about reducing flicker?
        • oooh, oooh, I get it!

          Because the LED is in series with the capacitor, the voltage drops, ensuring that there is a differential in the EMFs from the supply and the capacitor, and a current always flows.

          But the supply must be rectified and smoothed first, I suppose. Also solving the flicker.

          Do I get any soup?
          • Rectifying/smoothing is optional. You could simply put a pair of oppositely oriented LEDs in parallel, then connect the pair in series to the capacitor, end of circuit. Only a very limited current can flow into/out of the very small capacitance, and each phase of the AC flows through one or the other LED. Note that it is a particularly small capacitor to limit total current flow each phase, rather than a large one to pass AC current.

            ......+LED-
            AC___/.....\__)capacitor(___AC
            .... \...../
            ......-LED+

            Interesting

            • Exactly correct, I'd forgotten about the second LED in reverse parallel with the first. Been a while since I built that thing!

              I also note that some guys on the web are also using a series R to limit inrush current on startup.
      • Enough of them might screw up your power factor enough to piss off the power company, though

        This only applies if you are working with inductive loads like motors and such, not resistive loads like what you would see through an LED.
        • Ah, but the series capacitor is a capacitive load, which will change your power factor in the direction AWAY from the inductive loads...
    • I am wondering why you would have to put a transformer in each bulb. Why not just put one in the light fixture itself? Another poster mentioned that all fluorescent lightbulbs do have their own transformers, so perhaps there is a good reason the fixture couldn't have its own. Could someone comment on this?
      • Standard fluorescent fixtures have their ballasts as part of the fixture.

        The compact fluorescents designed to be used in incandescent fixtures have their ballast built into the lamp.

      • Why not create a device that plugs into the standard 110V AC incandescent socket (the standard screw type)? The device would have the transformer in it. The bottom half would have some other fixture type (a circular 12V DC car outlet, say). Each LED bulb would fit into the secondary fixture. This way, if the LEDs fail (never seen it happen), you don't have to replace a perfectly good transformer.

        I would rather wire a 12V DC circuit with special fixtures instead of putting a device in each fixture, as transformers are
    • >>However, there's an increasing number of
      >>appliances in a modern household that would be
      >>much better served by a 12V DC supply.

      While it would seem to be a good idea for the power companies to hook up a 12V DC circuit to your home, you must remember why the power companies use such high voltages to begin with. It has to do with the resistance of the wires.

      The best way to transmit 100,000 watts of electricity is at 100,000V AC at 1 ampere versus 1V AC at 100,000 amperes. The more amperes yo
    • It's madness that each lightbulb will have to contain its own little transformer

      You mean like 12V halogens? They seem popular enough. And remember, it isn't the voltage that causes problems (to a point), it's the power that is dissipated. Ohm's law says that Current = Voltage / Resistance, so raising the voltage across a fixed resistance raises the current proportionally - and since power is voltage times current, doubling the voltage quadruples the power dissipated. So when you increase the voltage, as long as the current sta
  • Just think what this could mean for online forums in general, and Slashdot in particular! ;>
  • Good news everybody. I just invented a new gadget: an integrated reflector that could lead to ubiquitous LEDs.
  • by doc modulo ( 568776 ) on Sunday July 25, 2004 @03:06PM (#9795691)
    At the moment, projectors are lighted by expensive, proprietary light bulbs.

    Because of the hot bulbs, the projectors are too noisy to enjoy a nice movie night at home and they burn out after a while.

    An array of LEDs would be superior because they'd be more durable (no need for expensive replacements after X hours) and might be cool enough for fanless beamers.

    Unfortunately the manufacturers use the projectors like razorblade holders or like inkjet printers. You can only fit the replacement bulb that the manufacturer made themselves and the replacement bulbs are very expensive because of that monopoly.

    However, all it takes is ONE manufacturer to produce a good LED beamer to disrupt the current situation. All the others will have to follow if they want customers after LED light sources take over, the sooner the better.

    I made up my mind to ONLY buy an LED beamer because I know it's possible and I know I'll be screwed over by the current beamers if I don't. The fewer bulb-beamers we as consumers buy, the faster the changeover will happen.
    • The projector lamps are all standard. Where the manufacturer is gouging you is for the integrated reflector. The style of discharge lamp used in projectors tends to have a single hotspot that needs to get focused to get optimal illumination. Misfocusing the hotspot can lead to most of the light not being reflected into the picture element, or too much heat being reflected into the element, burning out the LCD/DLP/film/whatever. No imaging tech is immune to this.

      For an example of price differences, an X

    • Projector bulb systems typically put out light in the order of 5000-8000 lumens (although cheap ones make do with less). Even high-wattage white LEDs put out nowhere near this amount of light and so you would need a bunch of them together to get the same bright image you are used to with your noisy gas discharge system. The innovation will not increase efficiencies enough in the near-term to change this.

      You won't get rid of the noise either...since LEDs require a low junction temperature to operate eff
      • You have no idea what your talking about. Your better off questioning the veracity of the statement of 99% efficiency. Given that 100% efficiency is somewhere around 220 Lumens/Watt your looking at a tops of 40 watts to generate the light required to exceed your estimate of 8000 lumens. But don't forget that 99% of that energy is being converted into light. The actual heat energy is somewhere in the neighborhood of 1/2 a watt. This easily qualifies for fanless operation. That being said I am skeptical
        • Actually I do... (Score:3, Informative)

          by Pinkoir ( 666130 )
          If you read the article you will see that there is no claim on 99% efficiency of the LED. The claim is 99% efficiency on the reflector. No LED anywhere in the world comes even close to 220 lumens/watt. The best I've seen in the real world is about 80 lumens from the 3 watt Luxeon devices put out by Lumileds. These devices are very hot and need a lot of heat-sinking to avoid destroying themselves in any confined application. You have to remember that LEDs aren't magic. They are just full of inefficienc
          • I am guilty!!! Please direct the comment about knowing what one is talking about to myself and whoever posted this story. I only read the story header, which does claim that they raised the efficiency of LEDs to 99%. I don't know why I even bother reading Slashdot anymore; I'm guessing I should spend more of my free time elsewhere.
        • Your == possessive. You're == you are.

          You're awfully inflammatory. The LED is not 99% efficient -- the reflector inside the LED is. Typical white LEDs are well under 40%.
    • An inexpensive array of LEDs that could possibly emit enough light is a fool's hope. Now, solid state RGB lasers that rapidly scan the entire screen CRT-style may soon be feasible...
    • Despite the naysayers here, you could be on to something.

      A DLP projector uses a "white" bulb and reflects the light through red, green, and blue filters. The filters (ideally) are band pass filters, allowing only a fraction of the light to pass, and absorbing the rest, which must be reradiated as heat.

      If our light source was "tuned" to put most of its power out in the bands used by the filters, then a lot less light would be wasted. So if you could make your light source from properly tuned and bright re

  • See :

    http://www.google.com/search?hl=en&ie=UTF-8&q=omron+dr-led&btnG=Google+Search [google.com]
    http://www.omroncomponents.co.uk/Press/DR-LED.pdf [omroncomponents.co.uk]

    The above operates according to the same general idea of recovering light that would otherwise be lost, although in this case the implementation is completely different and much less sophisticated than that referenced in the post.

    The manufacturer claims a 2x improvement over conventional LEDs as well. Unfortunately, they seem to have suddenly discontinued the
  • "In an LED, light emits from inside the semiconductor in every direction, but our omnidirectional mirror reflects light equally well no matter what the angle of incidence. Other types of reflectors are only efficient when the angle is normal, or 90 degrees perpendicular to the surface," said Schubert.

    I wonder if this technology could be used to enhance the efficiency of solar collection devices. This reminds me of the way that plants collect and use photons. Any solar engineers out there?

    • I'm not a solar engineer, but I have a suspicion that the power you get depends on the watts/area you get on your solar cells. You could get greater watts/area using this new reflection technology. A bunch of mirrors can concentrate the sunlight with greater efficiency. However, it remains to be seen just how much of a percent increase in watts/area on solar cells this will engender. Given that this is an "omnidirectional mirror", you could have your collector array operating at peak efficiency for a muc
      • An omnidirectional mirror will not magically reflect the light in the direction you need (i.e. where the solar cells are), so you still need to move the mirror during the day. These omnidirectional mirrors wouldn't be an improvement over current technology.
  • But can LEDs be dimmed to get mood lighting? Hmm... I bet all you Slashdotters failed to think of that scenario... haha...
