Now you're over-thinking this. Within their length limitations, copper cables can have very low bit error rates (BER); typical 10GbE copper BER specs are on the order of 10^-15. Also, error detection on digital interconnects uses a scheme called Cyclic Redundancy Check, or CRC. Without going into a long lecture on how CRCs work, the salient point (sorry ADTG, I had to use your favorite word) is that the CRC state machines check every data chunk transmitted anyway, so the check logic doesn't draw significantly more power when errors occur. What does use more power in networking is retransmission after frame or packet drops, but HDMI doesn't support retransmission as far as I can tell. So your point about greater heat generation doesn't apply.

I've been trying to find an HDMI 2.1 specification but haven't yet. I don't think HDMI provides any error correction mechanism like ECC; I suspect all that happens on an HDMI transmission error of any kind is a drop-out. So it's purely a video/audio quality consideration.
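To make the CRC point concrete, here's a minimal sketch of per-chunk error detection. This uses CRC-32 from Python's zlib purely for illustration; HDMI and Ethernet links use their own polynomials and do this in hardware, but the transmit-append/receive-recompute-compare flow is the same:

```python
import zlib

# Transmitter side: compute a CRC over the data chunk and send it along.
frame = b"example payload bytes"
crc_sent = zlib.crc32(frame)

# Receiver side: recompute the CRC over the received bytes and compare.
# A clean transmission yields a matching CRC.
assert zlib.crc32(frame) == crc_sent

# Flip a single bit to simulate a transmission error:
# the recomputed CRC no longer matches, so the error is detected
# (detected, not corrected -- CRC alone can't repair the chunk).
corrupted = bytes([frame[0] ^ 0x01]) + frame[1:]
assert zlib.crc32(corrupted) != crc_sent
```

Note the check runs on every chunk regardless of whether an error occurred, which is why error detection itself adds no meaningful power draw.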