Is PoE more efficient than plugging in adapters for each network device?

And at what scale does it start to matter?

For my situation: I’m planning a 3-node mesh router setup plus 2 switches, and was wondering whether, over 5 years, the electricity difference would be less than the extra upfront cost. The absolute maximum cable length would be around 30 m.

  • MangoPenguin@lemmy.blahaj.zone · 1 year ago

    My thoughts are:

    • With PoE you’re doing 2 conversions, which could waste more power: AC to 48 V at the switch, then 48 V down to whatever the device needs via its internal buck converter. You also have slightly higher losses on the longer run of low-voltage 48 V DC through Ethernet, versus AC.

    On the other side of things:

    • With PoE you only have 1 AC-DC conversion happening. Every wall-wart power adapter has an idle power draw even without a load attached; with PoE you just have the single switch power supply wasting power.

    Overall I doubt the difference will be large enough to matter, and some PoE switches are quite power-hungry even with nothing plugged in, for some reason, so PoE could end up costing more.
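    To put rough numbers on the idle-draw trade-off over OP’s 5-year horizon, here’s a sketch; the per-wart idle figure, switch overhead, and electricity price are all assumptions, not measurements:

```python
# Rough 5-year cost of idle ("vampire") draw from individual wall warts,
# versus a single PoE switch PSU. All numbers are assumptions.
HOURS_PER_YEAR = 24 * 365

def five_year_kwh(idle_watts: float) -> float:
    """Energy burned over 5 years by a constant idle draw, in kWh."""
    return idle_watts * HOURS_PER_YEAR * 5 / 1000

warts = 5                # one adapter per device (3 mesh nodes + 2 switches)
idle_per_wart = 0.3      # W, assumed for a typical modern adapter with no load
wart_kwh = five_year_kwh(warts * idle_per_wart)

poe_overhead = 2.0       # W, assumed idle overhead of one PoE switch PSU
poe_kwh = five_year_kwh(poe_overhead)

price_per_kwh = 0.30     # assumed electricity price, $/kWh
print(f"wall warts: {wart_kwh:.0f} kWh -> ${wart_kwh * price_per_kwh:.2f}")
print(f"PoE PSU:    {poe_kwh:.0f} kWh -> ${poe_kwh * price_per_kwh:.2f}")
```

    With these made-up figures the two options end up within a few dollars of each other over 5 years, which is the point: the idle losses are small either way.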

    • gramathy@lemmy.ml · 1 year ago

      DC-DC conversion is pretty efficient, so I wouldn’t worry about losses after the initial step-down to 48 V, but I would potentially worry about losses from poor-quality home wiring on longer runs in bigger homes.

  • swicano@kbin.social · 1 year ago

    I don’t think I would trust anyone’s answer here. The only way to know is to test it. Theoretical talk about ‘more conversions’ rather discounts the entire field of power-supply design. We need someone to put a Kill A Watt on a system using PoE, and then again on the same system using external adapters.
    I tried Googling to see if anyone had done that, and didn’t find any real testing (on the first page of Google, at least).
    I do have these findings to report: 1) PoE is marketed as cost-saving largely on install and maintenance costs: fewer cable runs to awkward AP locations, less electrical work, etc. So we can’t assume that PoE’s wide usage is due to electricity cost savings. And 2) improving the efficiency of newer PoE power supplies is an active area of development, meaning a particularly old set of PoE hardware might be less efficient than expected.

  • litchralee@sh.itjust.works · 1 year ago

    As other posters have remarked, it’s difficult to offer a generalized statement on PoE efficiency. One thing I will point out that hasn’t been mentioned yet is that PoE switches tend to have poor “wall to device” efficiency when lightly loaded. Certifications like 80 Plus only assess efficiency at specific loading levels.

    Hypothetically, a 400 W PoE switch serving only a 5 W load may cause an additional 10 W to be drawn from the wall, which is pretty horrific. But when serving loads totalling 350 W, it might draw only 390 W from the wall, which could be acceptable efficiency.
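    Working out the wall-to-device efficiency for those hypothetical figures:

```python
# Wall-to-device efficiency for the hypothetical loads above.
def efficiency(load_w: float, wall_w: float) -> float:
    """Fraction of wall power that actually reaches the devices."""
    return load_w / wall_w

light = efficiency(5, 5 + 10)    # 5 W load, 10 W of extra wall draw
heavy = efficiency(350, 390)     # 350 W load, 390 W from the wall
print(f"lightly loaded: {light:.0%}")   # 33%
print(f"heavily loaded: {heavy:.0%}")   # 90%
```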

    Your best bet is to test your specific configuration and see how the wall efficiency looks, with something like a Kill A Watt. Note that even a change from 120 V to 240 V input can affect your efficiency results.

  • poVoq@slrpnk.net · 1 year ago

    Probably not, but it is nice to still have WiFi in the house during a power cut, since you can easily run the PoE switch from a UPS.

  • Max-P@lemmy.max-p.me · 1 year ago

    I’ll add that it also depends on the efficiency of the local power supplies those devices would use as wall warts. Those are often pretty generic, and may be run at only 25% of rated load, which for some wall warts is outside their peak-efficiency range. A single power supply in the form of PoE can be more efficient if it lets both the switch and the PoE regulator in the device operate at a better point on their efficiency curves.

    In some ways, stepping 48 V DC down to 3.3/5 V is a bit easier than stepping down the ~168 V DC that results from rectifying 120 V AC. But the wart could instead step the 120 V AC down to 5 V first with a simple transformer; those are nearly always more efficient (95%+) than a DC-DC buck converter, though buck converters can still reach 90% as well.

    In terms of cabling, power loss is a function of current and length (resistance). AC is nice because we can step it up easily and efficiently to extremely high voltages so as to minimize the current flowing through the wire, and then step it back down to a manageable voltage. In that sense, American 120 V has more loss than the 240 V used in most of the rest of the world, although it only matters for higher-power devices. That also means the location of the step-down matters: if you run 30 m of Ethernet and a parallel 30 m run of 5 V power, there will be more loss than if you just ran PoE. But again, you need to account for the efficiency of the system as a whole. Maybe you’d have a wart that’s 5% more efficient, but you lose that 5% in the cable and it’s a wash. Maybe the wart is super efficient and it’s still way better. Maybe the switch is more efficient.
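    A sketch of the cable-loss math for OP’s 30 m run, assuming a typical 24 AWG Cat5e conductor resistance of about 0.084 Ω/m and power carried over two pairs, as in 802.3af/at (both figures are assumptions for illustration):

```python
# I^2 * R loss on a 30 m run: 48 V PoE versus running 5 V directly.
def cable_loss_w(power_w: float, volts: float, loop_ohms: float) -> float:
    """Watts dissipated in the cable for a given delivered power and voltage."""
    current = power_w / volts          # amps drawn by the device
    return current ** 2 * loop_ohms   # I^2 * R

OHMS_PER_M = 0.084                      # assumed 24 AWG Cat5e, per conductor
length_m = 30.0
pair_ohms = OHMS_PER_M * length_m / 2   # two conductors in parallel per pair
loop_ohms = 2 * pair_ohms               # out on one pair, back on the other

print(f"15 W at 48 V: {cable_loss_w(15, 48, loop_ohms):.2f} W lost")
print(f"15 W at  5 V: {cable_loss_w(15, 5, loop_ohms):.2f} W lost")
```

    With these assumptions, the 48 V run loses about a quarter of a watt, while pushing the same 15 W at 5 V would burn over 20 W in the cable alone, which is why PoE uses 48 V in the first place.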

    It’s going to be highly implementation dependent in how well tuned all the power supplies are across the whole system. You’d need either the exact specs you’ll run, or measure both options and see which has the least power usage.

    I would just run PoE for the convenience of not needing an outlet near each device, especially APs, which typically work best installed on ceilings. Technically, if you run the heat at all during the winter, the loss from the power supplies contributes ever so slightly to your heating, but it also works against your AC in the summer. In the end, I’d still expect the losses to amount to pennies, or at most a few dollars. It may end up more expensive just in wiring if some devices are far from an outlet.

  • Boring@lemmy.ml · 1 year ago

    Surely there is a switch out there that can detect the voltage used by the device and deliver only that amount.

  • ChaoticNeutralCzech@feddit.de · 1 year ago

    Depends on the voltage.

    Most IT devices’ key components run at 3–5 V nowadays, and the voltage needs to be converted down if the PoE voltage is higher. This introduces losses, especially if a linear regulator is used as opposed to a buck converter. On the other hand, one powerful PSU is slightly more efficient than many smaller ones.

    I don’t think there is a huge difference. If you run a lot of tiny devices (too small/cheap to use a buck converter) off a significantly higher voltage like 24 V or 48 V and/or the cabling is very long, PoE will be less efficient. If the PoE voltage matches the devices’ adapters’ voltage and the cables are reasonably short (<30 m) with few connectors, PoE may be more efficient.
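    A quick illustration of why the regulator topology matters: a linear regulator’s best-case efficiency is just Vout/Vin, while a buck converter stays roughly flat across input voltages (the 90% buck figure below is an assumption, not a measurement):

```python
# Best-case linear regulator efficiency is Vout/Vin; everything above
# the output voltage is burned as heat. A buck converter is assumed ~90%.
def linear_eff(v_in: float, v_out: float) -> float:
    """Upper bound on linear regulator efficiency for a given step-down."""
    return v_out / v_in

BUCK_EFF = 0.90  # assumed, typical for a decent buck converter

for v_in in (5, 24, 48):
    lin = linear_eff(v_in, 3.3)
    print(f"{v_in:2d} V -> 3.3 V: linear {lin:.0%}, buck {BUCK_EFF:.0%}")
```

    This is the “tiny devices off 48 V” case above: a linear step-down from 48 V to 3.3 V can never beat ~7% efficiency, so cheap devices without a buck converter do badly on high PoE voltages.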

  • antlion@lemmy.dbzer0.com · 1 year ago

    Considering the PoE switch is also powered by an adapter, I’d say no. You have the same efficiency from the AC-DC conversion, plus line losses.

  • slazer2au@lemmy.world · 1 year ago

    It depends on whether your PoE device can communicate back to the switch to lower the power output to match its requirements.

    A switch will generally push the full power over the wire unless the remote device can talk back with an LLDP power-management TLV to lower it, while a device connected via a wall wart will pull only what it needs.
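    A sketch of how that per-port budgeting works, using the standard 802.3af/at class allocations (the 65 W total switch budget is hypothetical):

```python
# 802.3af/at power classes: without LLDP negotiation, the switch (PSE)
# reserves the full class allocation per port, even if the attached
# device actually draws far less.
PSE_BUDGET_W = {0: 15.4, 1: 4.0, 2: 7.0, 3: 15.4, 4: 30.0}  # per IEEE 802.3

def budget_remaining(total_budget_w: float, port_classes: list[int]) -> float:
    """Power left in the switch's PoE budget after reserving each port's class."""
    return total_budget_w - sum(PSE_BUDGET_W[c] for c in port_classes)

# Hypothetical 65 W switch with three class-0 devices attached: ~46 W is
# reserved even if each device only draws a few watts.
print(round(budget_remaining(65.0, [0, 0, 0]), 1))  # -> 18.8
```

    Note this reservation is about budget allocation, not actual energy drawn; whether it costs you watts at the wall depends on the switch, which is why measuring is the only real answer.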

    • MangoPenguin@lemmy.blahaj.zone · 1 year ago

      Devices pull power; it doesn’t matter how much the PoE switch can supply, as each device will only pull what it needs.