
New Computer Powered By PoE

BlakeCaldwell writes "BBC News is reporting on a new PC that's powered via a network cable rather than through a wall socket. The computer requires only 12 watts, below the upper limit of 15.4 watts that Power over Ethernet (PoE) can supply. FTA: 'PoE could end up being a universal power supply system as the cables and connectors for it are the same all over the world. By contrast power sockets and plugs differ by country.'"
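
For reference, a minimal Python check of that power budget. The 12-watt draw is the article's figure; the 15.4 W per-port source limit and the roughly 12.95 W guaranteed at the powered device after worst-case cable loss are from the 802.3af spec, not the article:

    # Power-budget check: the article's 12 W PC against 802.3af limits.
    PSE_LIMIT_W = 15.4        # max power a switch port may source
    PD_GUARANTEED_W = 12.95   # min power guaranteed at the device end
    PC_DRAW_W = 12.0          # the PC described in the article

    print(f"Port budget: {PSE_LIMIT_W} W sourced, {PD_GUARANTEED_W} W delivered")
    print(f"Headroom at the device: {PD_GUARANTEED_W - PC_DRAW_W:.2f} W")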
  • Almost Brilliant (Score:4, Insightful)

    by AKAImBatman ( 238306 ) * <akaimbatman@gmaYEATSil.com minus poet> on Friday April 29, 2005 @10:19AM (#12383076) Homepage Journal
    I was thinking that this had to be one of the most brilliant ideas ever, right up until I realized that users are moving toward WiFi for connectivity. If this had arrived two to three years ago, we might all be using it now. But at this juncture? Likely to be ignored. :-/
  • No GigE support (Score:2, Insightful)

    by nd ( 20186 ) <nacase AT gmail DOT com> on Friday April 29, 2005 @10:24AM (#12383150) Homepage
    This works using the "unused" lines of Cat5. Sure, they're unused for 10/100 Ethernet, but this will be much less useful once everyone is using Gigabit Ethernet (which uses all four pairs).
  • by Tree131 ( 643930 ) on Friday April 29, 2005 @10:25AM (#12383164)
    Not everyone has PoE at home, so this solution would only be ideal for businesses. You can of course always get a PoE cable that plugs into the wall socket through a transformer and the Ethernet jack, thereby combining the power; however, that defeats the purpose of PoE, because most devices out there support both 110 V and 220 V, and they all have universal connectors and power supplies capable of handling whatever voltage you throw at them. And you'll still be plugging into an electrical socket. You will also need a helluva lot more power to run processor-intensive apps, so this would pretty much limit this machine to secretaries and web surfers/the majority of home users - see above on why this is not a solution for the home.
  • wiring mistakes (Score:1, Insightful)

    by bigmo ( 181402 ) on Friday April 29, 2005 @10:27AM (#12383198)
    can be a very bad thing with Power over Ethernet. I suppose most equipment would be OK with power on the wrong pins, since it's probably 5 V anyway. However, some equipment (such as Cat5 audio/video distribution boxes) isn't usually made to handle power in the wrong place or, even worse, at 12 V. For the right app it's really convenient, but I think "universal power supply" is a little optimistic.
  • by terraformer ( 617565 ) <tpb@pervici.com> on Friday April 29, 2005 @10:28AM (#12383201) Journal
    I agree totally, but with one caveat. I work in energy efficiency, specifically that of computers. Business and enterprise continue to use 10baseT and show no signs of changing that for their desktops (not saying they are not using WiFi...), and a business with 10K PCs spends hundreds of thousands to as much as a million dollars a year on energy for PCs (including monitors). What this eliminates is a power supply per PC and the attendant overhead, consolidating the power supplies for groups of computers (power supplies/transformers have efficiency issues depending on load). Also, this forces them to build a desktop with the usage profile of a highly efficient laptop to get under the 15.4-watt limit. The cost savings of using this technology could be very attractive to business. The WiFi concern applies primarily to home and small-business networks.
  • by Speare ( 84249 ) on Friday April 29, 2005 @10:37AM (#12383318) Homepage Journal
    Uh, fewer cables and redundant AC/DC converters (wall warts)? Why does every single device need to have a heavy power-processing unit to do the same task of AC/DC conversion? Do it once and make many devices share the low-voltage supply.
  • Nice idea, but... (Score:3, Insightful)

    by Cyn ( 50070 ) <cyn.cyn@org> on Friday April 29, 2005 @10:39AM (#12383349) Homepage
    an Ethernet plug is a lot more fragile and prone to 'not snapping in properly' than your average power plug. If some critical control system is powered separately and disappears from the network, you plug it back in. If it was getting power over that same cable, it now has to boot back up, reinitialize, and figure out where it left off.

    Don't get me wrong, it's a nice thought - but personally I've run into a fair variety of RJ45 jacks. Maybe this would finally snuff out those people making the shitty ones, so I'm all for that.
  • The REAL solution (Score:3, Insightful)

    by zakezuke ( 229119 ) on Friday April 29, 2005 @10:41AM (#12383367)
    Establish a GLOBAL standard for power and just go with it. Why not just 12 V DC, the already-established standard for autos? As others have already pointed out, PoE is such a Mickey Mouse solution that it will likely confuse people. Pick a plug... anything in the 10 mm size should be just dandy.

    Perhaps someone who has wired their house for low voltage would share their solutions. IIRC you can't have low and high voltage in the same gang box according to the NEC (National Electrical Code, USA), which is unfortunate, as that would be the obvious way to take wall current and convert it to low voltage - apparently a no-no.
  • by sysadmn ( 29788 ) <{sysadmn} {at} {gmail.com}> on Friday April 29, 2005 @10:41AM (#12383370) Homepage
    for the vendor. What this overlooks is that there is a reason designers select proprietary power and data connections: it gives that vendor a head start in selling you all the other useful things that plug into that port. The worst offenders are cell phone and PDA makers; notebook vendors are almost as bad. Commodity players might have a reason to adopt a standard to drive costs down, but lots of others do not.
  • by adzoox ( 615327 ) * on Friday April 29, 2005 @10:45AM (#12383402) Journal
    I've always liked the iPod AC adapters that used FireWire cables to charge the iPod, and thought Apple (to save money and promote FireWire) should have standardized all its adapters on this spec and look.
  • by AKAImBatman ( 238306 ) * <akaimbatman@gmaYEATSil.com minus poet> on Friday April 29, 2005 @10:51AM (#12383474) Homepage Journal
    I'm kind of going back and forth on this in my head. On one hand, reducing power to 12 watts sounds like a good thing. On the other hand, a modern Pentium processor chews through way more than that at medium load. Would the reduction in system performance be justified by the cost savings? Well, let's do some calcs. Let's assume that a modern PC with a CRT draws a constant 100 watts. (On the high side, I know.) Let's figure that out across 30 days:

    30 * 24 * 100 / 1000 = 72 kWh/month

    At California rates (some of the highest in the nation), we get a cost per computer of:

    72 * $0.096 = $6.912

    For 10,000 computers, that comes to:

    10,000 * $6.912 = $69,120

    Now let's say that we use PoE and get the computers down to 12 watts. Some of the energy is lost in transmission, so we'll say that we consume 20 watts per computer:

    30 * 24 * 20 / 1000 = 14.4 kWh/month
    14.4 * $0.096 = $1.382 per computer
    10,000 * $1.382 = $13,820

    Those look like pretty nice savings, but are they actually sufficient to warrant the switch to a slower machine? In a company with 10,000 PCs, the difference works out to the cost of a few employees on staff. Not pocket change, but not massive savings either. And what if you just go around and replace all the CRTs with LCDs? Then the gap probably won't look as interesting, and employers are unlikely to worry about the savings of further power reductions.
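
    As a quick check, here is a minimal Python sketch of the arithmetic above. All inputs are the assumptions stated in this comment (100 W conventional, 20 W PoE including line loss, $0.096/kWh), not measured figures:

        # Sketch of the cost comparison above; all inputs are assumptions.
        HOURS_PER_MONTH = 30 * 24      # 720 hours, machines left on 24/7
        RATE_PER_KWH = 0.096           # assumed California rate, $/kWh

        def monthly_cost(watts, fleet=10_000):
            kwh = HOURS_PER_MONTH * watts / 1000.0   # kWh per machine
            return fleet * kwh * RATE_PER_KWH

        conventional = monthly_cost(100)   # PC + CRT: $69,120
        poe = monthly_cost(20)             # PoE PC incl. line loss: $13,824
        print(f"Monthly savings for 10,000 PCs: ${conventional - poe:,.0f}")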
  • PoE is a kludge! (Score:5, Insightful)

    by wowbagger ( 69688 ) on Friday April 29, 2005 @10:55AM (#12383529) Homepage Journal
    PoE is just another kludge being standardized because the industry is too lazy and stupid to define a proper standard.

    Ethernet cables were designed to carry DATA, not power. Running a 12W computer off PoE with any kind of distance to the power-providing hub is going to require about 20W of input to make it work - with the 8W difference going to heat the cables.

    With all the concern over the leakage current of wall warts, this is an improvement?

    Consider the history of bad decisions like this:
    • "Power Points" in cars. Lighter sockets were designed for lighters, not laptops. They have poor mechanical retention (because the lighter needs to be able to pop out when hot), high contact resistance (so what if the contacts get hot? They are SUPPOSED to get hot!), and a really nasty failure mode (Lil' Billy dropping a penny in them while he waits for mommy to get out of the store). But rather than defining a sensible power connection, the automobile industry lazily continue to push lighter sockets as a power point.
    • USB-port-powered devices which provide no USB functionality. USB humidifiers? [thinkgeek.com] Cup warmers? [thinkgeek.com] Christmas trees? [thinkgeek.com] Ash trays? [thinkgeek.com] Cell phone chargers? [calcellular.com] USB was designed to allow your computer to *control* things, not act as a glorified wall wart!

    Now we have this stupid idea. "But Ethernet is standard world-wide, and power jacks aren't!"

    So? How about coming up with a standard power/data services jack and deploying it? It's not like Ethernet jacks were a natural phenomenon - they were a standard that was created and deployed.

    A nice standard power/data jack, with a standardized supply voltage high enough to move a reasonable amount of power through reasonably sized wires, and a data services jack designed to *move data* would be so much nicer in the end.

    Also, consider this: You have your plant with a bunch of these PoE computer terminals, each tapping power from your central hub. Each computer will inject a small amount of noise onto the line - that's just a fact of life. How much will that noise start to degrade the network signal - especially when you start talking about gigabit Ethernet?

    What if we just standardized on, say, a pair of Anderson Power Pole [andersonpower.com] connectors supplying 24 VDC at 2 A max, right under a standard RJ-45 Ethernet jack? Devices that want to pull power and data have a combined plug which mates to both sets of connectors; standard Ethernet devices use the top port only. Standardize on 14-gauge wire for power.

    Now you have a sensible standard power port that can be used internationally, still requires the user to just plug one thing in, and isn't a kludge!

    (O.T. What is with /. suddenly deciding to replace </li> elements with </li><li> ? It screws up making proper HTML lists!)
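
    The "8 W going to heat the cables" claim above is easy to sanity-check with Ohm's law. A minimal Python sketch, assuming a 48 V supply (what 802.3af actually uses), 100 m of 24 AWG Cat5 at roughly 0.084 ohm/m per conductor, and power carried on two pairs with each pair's two wires in parallel:

        # I^2*R check of the "8 W lost in the cables" claim.
        # Assumptions: 48 V supply, 12 W load, 100 m run, 24 AWG copper.
        OHM_PER_M = 0.084              # typical 24 AWG conductor
        LENGTH_M = 100.0               # Ethernet's maximum segment
        # Two conductors in parallel each way halves the resistance:
        r_loop = 2 * (OHM_PER_M * LENGTH_M / 2.0)

        current = 12.0 / 48.0          # first-order line current, amps
        loss = current ** 2 * r_loop   # power dissipated in the cable
        print(f"Loop resistance {r_loop:.1f} ohm, cable loss {loss:.2f} W")

    Under those assumptions the loss over a full 100 m run comes out around half a watt, not 8 W; multi-watt cable losses would only appear at much lower supply voltages.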
  • by mrand ( 147739 ) on Friday April 29, 2005 @11:02AM (#12383604)
    While neat in theory, and useful in certain applications, in general there are a few practical problems with making "many devices share the low-voltage supply":

    1. Current flow goes up as voltage goes down (to deliver the same number of watts). You don't want to be transmitting a high DC current, because series resistance will eat your lunch: Current * Resistance = Voltage drop (aka V=IR, aka Ohm's law). See the worked example after this comment.

    2. Following on #1, all the devices sharing one supply need to be relatively close to it.

    3. Even for low-current applications, different devices need different, and sometimes multiple, voltage rails. Do you supply them all, or supply just some and make the target device derive the others?

    4. Following on both #1 and #3, DC voltage and, more importantly, power requirements change over time, so in the end you'd likely end up with what you have now... multiple DC supplies, some for older devices, some for newer devices.

    Now, a number of these problems could be avoided if you used a high enough DC voltage (say 48 V), but then you have a safety issue if high currents can be delivered, not to mention that each device would need to step down the 48 V - so you end up with the same thing you have now.
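
    To put rough numbers on point 1, here is a hypothetical Python comparison: delivering the same 48 W over a 10 m run of 18 AWG wire (about 0.021 ohm/m) at 12 V versus 48 V. The wire gauge, run length, and load are illustrative assumptions:

        # Hypothetical numbers for point 1: same power, two supply voltages.
        R_LOOP = 0.021 * 20            # 10 m out and back of 18 AWG, ohms
        POWER_W = 48.0

        for volts in (12.0, 48.0):
            amps = POWER_W / volts             # I = P / V
            v_drop = amps * R_LOOP             # Ohm's law: V = I * R
            p_lost = amps ** 2 * R_LOOP        # heat in the wire
            print(f"{volts:4.0f} V: {amps:4.1f} A, "
                  f"drop {v_drop:.2f} V, {p_lost:.2f} W lost")

    Quadrupling the voltage cuts the current by a factor of four and the wire loss by a factor of sixteen for the same delivered power (here: 6.72 W lost at 12 V versus 0.42 W at 48 V).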
  • by SorcererX ( 818515 ) on Friday April 29, 2005 @11:03AM (#12383613) Homepage
    Well, first of all, the power loss is much greater at voltages like 12 V than at 230 V and so on; besides, I'd prefer to have 1-2 amps going to my computer instead of 20.
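
    The amperage point is just I = P/V. A tiny Python illustration, assuming a 240 W desktop (a figure picked for the example, not from the thread):

        # Current draw for an assumed 240 W load at two supply voltages.
        POWER_W = 240.0
        for volts in (12, 230):
            print(f"{volts:3d} V -> {POWER_W / volts:4.1f} A")
        # 12 V -> 20.0 A; 230 V -> 1.0 A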
  • by mpe ( 36238 ) on Friday April 29, 2005 @12:08PM (#12384376)
    Ethernet cables were designed to carry DATA, not power.

    Using the same cable to carry both data and power has been going on for a century.

    Running a 12W computer off PoE with any kind of distance to the power providing hub is going to require about 20W of input to make it work - with the 8W difference going to heat the cables.

    This isn't a problem with telephone cables, which tend to be both longer and of poorer overall quality than network cables. Ethernet has a maximum length of 100 metres, as opposed to several kilometres for unrepeated telephone circuits.
  • by juanfe ( 466699 ) on Friday April 29, 2005 @02:28PM (#12386088) Homepage
    Yes!
    Let the UN general assembly do it!
    No, wait, maybe it should be the ITU!
    No, wait, maybe it should be the ISO!
    Hmm... maybe the International Electrotechnical Commission?

    Oh, wait... the US doesn't like standards-setting bodies. OR international organizations, for that matter.

    It's better to have a hodgepodge of cell phone technologies [about.com] that don't talk to each other, a silly measurement system based on body parts and British wheat [hypertextbook.com], a TV broadcast system [uk.com] that never gives you the same color twice [aroundcny.com], never mind a digital TV standard that the rest of the world won't use [cybercollege.com].

    I'm sure Bolton will take care of it once he's in the UN as our ambassador. Yeah, that's the ticket...
