esseph 5 days ago

Have you ever run a DC voltage drop calculation for a Cat5/6/7 cable?

It can be substantial. But yes, there are cable spec requirements for PoE depending on the demands of the device!
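A back-of-the-envelope sketch of that calculation (my numbers, not from the thread: 24 AWG copper Cat5e at roughly 0.0938 ohm/m per conductor, power carried on 2 pairs so each leg is two conductors in parallel, and the 802.3af worst-case figures of 44 V minimum at the PSE and 350 mA max current):

```python
# Back-of-the-envelope DC voltage drop for 802.3af over Cat5e.
# Assumed figures: 24 AWG copper, ~0.0938 ohm/m per conductor; with
# power on 2 pairs, each leg is two conductors in parallel, so the
# round-trip loop resistance works out to ~0.0938 ohm per metre.

R_PER_M = 0.0938          # ohm per metre of loop (2-pair delivery)
PSE_V = 44.0              # minimum PSE output voltage under 802.3af
I_MAX = 0.350             # amps, 802.3af class 3 maximum current

def pd_voltage(length_m: float, current_a: float = I_MAX) -> float:
    """Voltage seen at the powered device after the loop drop."""
    loop_r = R_PER_M * length_m
    return PSE_V - current_a * loop_r

for metres in (50, 100):
    print(f"{metres:>3} m: {pd_voltage(metres):.2f} V at the PD")
```

Even the worst case stays above the 37 V floor the PD must tolerate, but only because the spec budgets for exactly this drop; a marginal supply, thinner or CCA cable, or a higher-power standard eats that margin fast.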

The NEC as of 2017 has new standards and a whole section for PoE devices above 60W, including a section specifically on safety and best practices. It DOES have cable requirements that impact the cable standard chosen.

More info on that here: https://www.panduit.com/content/dam/panduit/en/landing-pages...

From: https://reolink.com/blog/poe-distance-limit/?srsltid=AfmBOop... --- PoE Distance Limit (802.3af)

The original 802.3af PoE standard ratified in 2003 provides up to 15.4W of power to devices. It has a maximum distance limit of 100 meters, like all PoE standards. However, because of voltage drop along Ethernet cables, the usable PoE distance for 15.4W devices is often only 50-60 meters in practice using common Cat5e cabling.

In addition, this piece of note from Wikipedia: https://en.wikipedia.org/wiki/Power_over_Ethernet#Power_capa... ---

The ISO/IEC TR 29125 and Cenelec EN 50174-99-1 draft standards outline the cable bundle temperature rise that can be expected from the use of 4PPoE. A distinction is made between two scenarios:

bundles heating up from the inside to the outside, and bundles heating up from the outside to match the ambient temperature

The second scenario largely depends on the environment and installation, whereas the first is solely influenced by the cable construction. In a standard unshielded cable, the PoE-related temperature rise increases by a factor of 5. In a shielded cable, this value drops to between 2.5 and 3, depending on the design.

PoE+ Distance Limit (802.3at) An update to PoE in 2009 called PoE+ increased the available power to 30W per port. The formal 100-meter distance limit remains unchanged from previous standards. However, the higher power budget of 30W devices leads to increased voltage drops during transmission over long distances.

PoE++ Distance Limit (802.3bt) The latest 2018 PoE++ standard increased available power further to as much as 60W. As you can expect, with higher power outputs, usable distances for PoE++ are even lower than previous PoE versions. Real-world PoE++ distances are often only 15-25 meters for equipment needing the full 60W.

wmf 5 days ago

I understand all that but... let's imagine I run 60W 802.3bt over 100m of cat5. The voltage drop will be bad. So what actually happens? Does the device detect voltage droop and shut off? Or does the cable just catch on fire?

leoedin 5 days ago

A longer cable won't just catch fire, because the power dissipation per unit of length is the same regardless of overall length. Imagine the most extreme case - a cable so long the voltage difference is 0V at the end. It's basically just a very long resistor dissipating 60W. But each meter of cable will be dissipating the same power as every other PoE setup.
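That argument can be checked in a few lines: at a fixed current, per-metre dissipation is just I²·r for one metre of loop, so total length never enters into it. (The loop resistance figure here is my assumption for 2-pair Cat5e, 24 AWG.)

```python
# Sketch of the point above: at a fixed current, the heat dissipated
# per metre of cable is the same whether the run is 10 m or 100 m.

R_PER_M = 0.0938   # ohm per metre of loop, assumed 2-pair 24 AWG Cat5e

def watts_per_metre(current_a: float) -> float:
    # P = I^2 * R for one metre of loop; total length cancels out.
    return current_a ** 2 * R_PER_M

for length in (10, 100):
    per_m = watts_per_metre(0.6)        # 600 mA, a high-power draw
    print(f"{length:>3} m: {per_m * 1000:.1f} mW/m, "
          f"{per_m * length:.2f} W total in the cable")
```

The per-metre number is identical for both runs; only the total wasted power (and the voltage drop) grows with length.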

Looping the cable or putting it in a confined space could cause issues. The cable could then catch fire even though it appeared to be operating normally to the PoE controller.

esseph 5 days ago

In my understanding, it depends on the device and the cable installation. Let's say it's a 59W device so it doesn't fall under NEC regulations as of 2017 for PoE devices over 60W.

The device needs a certain amount of power to keep itself alive. Depending on how the device is designed, and whether it actually adheres to the standards, it should simply not have enough power to start at, say, 80m. Or let's say they pushed the install from the get-go (happens all the time) and it's actually 110m of poor / underspec'd cable.

And let's say the device has enough power to start, but you're using indoor Cat5 that's been outdoors for 7 years, and you don't know it but it's CCA (copper-clad aluminum). If it's in a bundle with other similar devices drawing high power and there's enough heat concentrated at a bend, then yes, the cable could catch fire without the device ever having a problem. As long as the device has enough power, it's going to keep doing its thing until the cable has degraded enough to cause signal drop, and then, assuming it's using one of the more modern 4-pair PoE standards, it would just shut off. But that could be after the drapes or that Amazon box in the corner of the room caught fire.

We're just lucky in the residential space that PoE hasn't been as "mass market" as the iPhone, and that we've been slowly working up to higher power delivery as demands have increased.

IMO? It's all silly though. We should just go optical and direct-DC whenever possible ;)

throwaway67743 5 days ago

Depending on switch vendor and quality, they can actually increase the voltage output. The spec, IIRC, allows for up to 57V at the PSE, which an intelligent switch can modulate to overcome limited voltage drop. Cheaper switches (desktop etc.) just supply all ports 54V (or less, but it should be 54V at the source) from the same rail, without any modulation.

Aurornis 5 days ago

The specs have a voltage range that the source can put out and a corresponding range of voltages that the device must accept at the end of the wire, after voltage drop.

Longer wires don’t increase the overheating risk because the additional heat is divided over the additional length.
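The gap between those two ranges is the drop budget the cable is allowed to eat. A quick check with the 802.3af numbers (44 V minimum at the PSE, 37 V floor at the PD, 350 mA max, and my assumed 9.38 ohm loop resistance for 100 m of 2-pair 24 AWG Cat5e):

```python
# The spec's voltage-drop budget vs. the worst-case 100 m drop (802.3af).

PSE_MIN, PD_MIN = 44.0, 37.0   # 802.3af: PSE output floor, PD input floor
I_MAX = 0.350                  # amps, class 3 maximum current
LOOP_R_100M = 9.38             # ohms, assumed 100 m of 2-pair 24 AWG Cat5e

drop = I_MAX * LOOP_R_100M
budget = PSE_MIN - PD_MIN
print(f"worst-case drop over 100 m: {drop:.2f} V (budget: {budget:.1f} V)")
```

The drop fits inside the budget, which is exactly why a compliant source and a compliant device work at the full 100 m even though the voltage sag is real.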

jorvi 5 days ago

I mean... once you have to start buying specialized cables anyway, you might as well have specified in the PoE++[0] standard that only specialized PoE++ cables are accepted, verified by device handshake. Then you could engineer the whole thing to basically be "Powerline but with higher data rates", making the cable power-first instead of data-first.

[0] What horrible naming. PoE 2, 3, 4, etc. would have been much better.

esseph 5 days ago

Well, as of 2017 we have -LP marked cables now.

https://www.electricallicenserenewal.com/Electrical-Continui...