What's the Matter with HDMI?

admin

Audioholics Robot
Staff member
HDMI, as we've pointed out elsewhere, is a format which was designed primarily to serve the interests of the content-provider industries, not to serve the interests of the consumer. The result is a mess, and in particular, the signal is quite hard to route and switch, cable assemblies are unnecessarily complicated, and distance runs are chancy. Why is this, and what did the designers of the standard do wrong? And what can we do about it? Check out this informative article from our friends at Bluejeans Cable.


Discuss "What's the Matter with HDMI?" here. Read the article.
 
stratman

Audioholic Ninja
This reminds me of the same stupid mentality Detroit used to design cars by committee; it always ends up a disaster. Remember the Pontiac Aztek? It's the HDMI of cars.
 
gene

Audioholics Master Chief
Administrator
remember the Pontiac Aztek? It's the HDMI of cars
Haha, I love the analogy, and to this day I'm still floored by the people who actually bought that car :)
 
westcott

Audioholic General
Kurt is a great guy and very knowledgeable. He is also willing to go out of his way to help his customers in any way he can.

Great article, and I would love to see him write more for Audioholics. Cable technology can be very interesting, and Kurt has taught me a lot.
 
CharlyD

Enthusiast
Another issue with HDMI is that it imposes a very constrained and inefficient architecture on entertainment systems. HDMI was created to allow transmission of uncompressed video and audio from a source device (e.g. an HD-DVD player) to a display device. As described in Kurt's excellent article, this transmission is one-way with no error correction. The only two-way communication is the transmission of control signals between devices. It is possible to distribute an HDMI source signal to multiple displays, but switches for this purpose are expensive and complex.

Because the transmission is uncompressed, suitable codecs (e.g. MPEG-2/4) are required in each source device. It is also mandated that all modern TVs include an ATSC tuner with at least an MPEG-2 decoder. Of course, any codec in the display is redundant with an HDMI input. I should also mention that an ATSC signal has only a 6MHz bandwidth, far smaller than that required by HDMI.
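To put some rough numbers on that bandwidth gap - a sketch assuming ATSC's standard payload rate of about 19.39 Mbit/s in its 6 MHz channel, and 1080i video at 24 bits per pixel (illustrative figures, not taken from this thread):

```python
# ATSC squeezes its MPEG-2 transport stream into ~19.39 Mbit/s.
atsc_mbit = 19.39

# The same video, uncompressed: 1920x1080 interlaced at 30 full
# frames/second, 24 bits per pixel.
uncompressed_mbit = 1920 * 1080 * 30 * 24 / 1e6   # ~1493 Mbit/s

# So broadcast HD relies on roughly 77:1 compression.
ratio = uncompressed_mbit / atsc_mbit
print(round(uncompressed_mbit), round(ratio))     # ~1493, ~77
```

That two-orders-of-magnitude gap is why a codec has to live somewhere in the chain, and why an uncompressed interface like HDMI needs so much more raw bandwidth than the broadcast channel that delivered the content.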

It sure would have been nice if more thought had been given to an entertainment system architecture that better met the needs of consumers prior to the mass adoption of HDMI. Whatever happened to HANA? The High-Definition Audio-Video Network Alliance proposed a standard for networking CE devices over IEEE-1394 (FireWire) that would be far simpler and less expensive than an HDMI implementation.
 
highfihoney

Audioholic Samurai
This reminds me of the same stupid mentality used by Detroit to design cars by committee
Agreed, stupid & costly.

As far as I'm concerned, the whole business of HDMI is nothing more than snake oil & should be given the same negative attention as exotic cable vendors instead of being praised as the latest & greatest.
 
DavidW

Audioholics Contributing Writer
My two cents

Recently, in the course of researching HDMI DVD players, I noticed something while looking over user reviews of products people had purchased.

What struck me was that when people complained of poor player performance, it usually had to do with picture and sound trouble - the picture would stall or drop out, and the like - not quality issues such as mechanical part failure or apparent build quality.

My own suspicion, supported by what I know of HDMI and all the HDCP/DRM crap built into it, and by what this article says about sacrificing sound engineering of the format's performance in favor of content protection, is that the problems encountered are not manufacturing defects per se, but rather HDMI incompatibility issues from the lousy standard and loads of software glitches from the copy protection schemes.

Electronics manufacturers have unfortunately bought into this standard that benefits no one but the content providers. The manufacturers are eating the costs of returned products that are mistaken for defective when the only defect is the poor quality of the HDMI standard and the inherent HDCP problems. That cost, of course, is passed back to the consumer yet again in higher prices - not only to cover the added cost of warranty replacements, but also to pay HDMI Licensing, LLC to license their faulty standard, whose only purpose is to treat every consumer as a potential criminal, guilty with no way to prove innocence.

HDMI, the Manchurian DRM - a Broadcast Flag dormant until 2010

DTV + HDTV + HDMI + HDCP + DVI = BAD DRM


And as a whole, we've all been dumb enough to accept this treatment.
 
stratman

Audioholic Ninja
DavidW said:
HDMI, the Manchurian DRM - a Broadcast Flag dormant until 2010
Great analysis, David. Scarier still is the article you linked on Boing Boing. I want to see what happens in 2009-10, and who shows their hand first.
 

Pesch

Audiophyte
You are off by a factor of 100 in your estimate of the number of megapixels per second that would run over an HDMI cable. Due to compression, you're never actually going to get a true 1080p picture on your home TV. Not any time soon, anyway - it would take a 1.2-terabyte storage device to hold a single 2-hour movie in true 1080p.

We can do some simple math and figure out how many megapixels are actually being sent:

The largest HD storage device is currently a dual-layer Blu-ray disc at 50 GB.

50 GB * 1024 MB/GB = 51200 MB / Blu-ray disc

Now, assuming that 3 bytes are used per pixel:

51200 MB * 1 megapixel / 3 MB = 17066 megapixels / Blu-ray disc

Now, assuming that the average movie is about 2 hours long:

2 hours * 3600 seconds / hour = 7200 seconds / Blu-ray disc

Now, combining these two and canceling like terms: (17066 megapixels / Blu-ray disc) * (Blu-ray disc / 7200 seconds) = 2.3 megapixels / second.

This means that the rough maximum number of megapixels per second that a 50 GB Blu-ray disc can send over HDMI is around 2.3 megapixels/second - not 150 megapixels/second like the article says. And even this estimate of 2.3 megapixels/second is a bit high, since it ignores the audio track on the disc, which takes up another huge chunk of space.

Additionally, this means that the maximum data transfer rate from a Blu-ray disc will be around 7 MB/second - or about 57 megabits/second. Well below the 100 megabit/second standard for Cat-5 which you mention.

Since nowhere near as much information is being transferred over HDMI as you suggest in your article, inductance and the twisted-pair configuration become much less of an issue - even less so than with Cat-5. I fail to see where the problem with HDMI comes from, then.

TL;DR: Data compression makes a HUGE difference to how much information is sent over HDMI. The math in the article is for how much information would be sent over the HDMI cable for an uncompressed 1080p picture, something which is never used, and the article's result is extremely skewed because of that. You can safely ignore everything said in this article.
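For what it's worth, the arithmetic above checks out in a few lines of Python (though, as the replies below point out, it models the disc's average compressed bitrate, not the rate on the HDMI link itself):

```python
disc_mb = 50 * 1024        # dual-layer Blu-ray: 50 GB = 51200 MB
seconds = 2 * 3600         # 2-hour movie: 7200 s

megapixels = disc_mb / 3   # at 3 bytes/pixel: ~17067 megapixels per disc
mp_per_s = megapixels / seconds      # ~2.37 megapixels/second
mbit_per_s = disc_mb / seconds * 8   # ~57 Mbit/s average disc bitrate

print(round(mp_per_s, 1), round(mbit_per_s))
```

The calculation is self-consistent; the dispute in the thread is over its premise, i.e. whether the HDMI cable carries the disc's compressed stream or the fully decoded pixel stream.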
 
craigsj

Audiophyte
While considerable effort was made to explain the issues involved, the complaint boils down to a rant over the choice of twisted pair. The fact is, though, that coax can be manufactured at 100 ohm impedance, so it is not precluded. In fact, there are boatloads of examples where coax has been substituted in cables designed for longer distances. Also, while standard Cat5 may only be certified to 100MHz, that doesn't mean it can't support more, and there are plenty of low-cost twisted-pair cables certified to much higher frequencies. Shame on the author for being so misleading.

Had HDMI followed the engineering principles that broadcast equipment employ, no one would be able to afford it and it would have been a monumental failure for its intended market. Twisted pair is the correct choice for such a consumer application and to suggest that it is inadequate is ridiculous.
 
KurtBJC

Audioholic
You are off by a factor of 100 in your estimate on the amount of Megapixels per second that would be run on a HDMI cable.
That would be good news, if it were true.

HDMI does not carry compressed signals. It carries a pixel-by-pixel stream of the whole image. If the refresh rate is set at 60 Hz, it has to refresh the whole image (1080 x 1920, in the case of 1080p) 60 times per second.

The HDMI organization, in fact, frequently touts this as a GOOD thing, when it really isn't. Again and again, we see HDMI described as being superior to other methods because it carries "uncompressed digital video." Of course, although the signal in the HDMI cable is uncompressed, the source from which it is derived is, in every case applicable to the consumer world, heavily compressed (e.g., HD-DVD, ATSC, Blu-ray, MPEG).

The fact that a source recording is compressed does not have anything to do with the signal at the HDMI interface. That signal is a full-blown one-pixel-at-a-time stream. And if you don't believe me, you can download the spec document at www.hdmi.org and see for yourself.
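Kurt's point is easy to quantify; a sketch assuming standard 1080p60 video timing, where the full raster including blanking is 2200 x 1125 pixels (the familiar 148.5 MHz pixel clock - these timing figures are an assumption, not from the thread):

```python
# Visible 1080p image, refreshed 60 times per second:
active_px_per_s = 1920 * 1080 * 60   # 124,416,000 pixels/s (~124 MP/s)

# The cable also clocks out the blanking intervals, so the raster is
# 2200 x 1125 total pixels per frame:
total_px_per_s = 2200 * 1125 * 60    # 148,500,000 pixels/s (~150 MP/s,
                                     # matching the article's figure)

# TMDS expands each 8-bit color component to 10 bits, one component
# per differential pair, so each data pair carries:
tmds_bit_per_s = total_px_per_s * 10 # 1.485 Gbit/s per pair
```

That per-pair rate, run in parallel across multiple pairs with no error correction, is the load the cable actually has to carry regardless of how compressed the source recording was.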

Kurt Denke
Blue Jeans Cable
 
Seth=L

Audioholic Overlord
DavidW said:
HDMI, the Manchurian DRM - a Broadcast Flag dormant until 2010
What do you think the fuss is about enforcing the new copyright protection that has yet to be fully employed? ;) If HDMI and HDCP weren't so quirky they would have already built it into the new discs, but they can't seem to get the thing fully compatible. :rolleyes:
 
KurtBJC

Audioholic
The fact is, though, that coax can be manufactured at 100 ohm impedance so it is not precluded. In fact, there are boatloads of examples where coax has been substituted in cables designed for longer distances. Also, while standard Cat5 may only be certified to 100MHz, that doesn't mean that it can't support more and there are plenty of low cost twisted pair cables certified to much faster frequencies. Shame on the author for being so misleading.
Yes, coax can be manufactured at 100 ohms. However, it wouldn't be suitable for building an HDMI cable because HDMI signals are run balanced, which requires a symmetrical cable architecture (which, in effect, means twisted pair). It would also be possible to run HDMI by using two 50-ohm coaxes for each balanced line, grounding the shields in common to that pair's shield ground. To do that, though, you'd have to really control the lengths well because skew is a major factor in keeping HDMI signals together.

I have spent a lot of time with engineering staff at Belden on this problem, and I would point out that Belden is, among other things, a world leader in super-high-frequency data cabling, and has a whole range of Gigabit products out. I've seen sales samples of Belden 10-gig cable, though I think it hasn't hit market yet. And you know what? Every last one of the engineers I have ever heard comment on this issue, without any prompting from me, has observed, "it's a shame they didn't just run this in coax." When the guys who know as much as anyone in the world about running high-speed data in twisted-pair cable tell me that, I suspect that my making this point to the world in this article is not exactly "misleading."

Other high-speed data technologies aren't as fragile as HDMI. They are generally not run parallel; they are run with error correction. The fact is, and anyone who's had trouble running this stuff over distance will understand: HDMI is an unreasonably fragile interface, and the problem could easily, and cheaply, have been avoided.

SDI, as is used in the broadcast industry, would not be the only option for consumer use. HDMI could have been done more or less as-is, but instead of four parallel balanced pairs, four 75 ohm coaxes could have been run. Very low attenuation, excellent timing to minimize skew, excellent impedance control to limit return loss--it would simply work better. And if one needed to run it over extreme distance, a cheap breakout box with four BNC jacks and a run of CAT5 to carry the low-speed data and miscellaneous what-not would be all you'd need to extend it considerably, with no active circuitry. Using RG-11 cable you could run it 400 feet with no booster and remain within spec for attenuation. And that likely means you could go quite a bit further, failing spec, but still functional, in most applications.

Now, hardly anybody wants to go 400 feet. But with twisted pair, 50 feet has been a challenge for a lot of manufacturers. And if your cable works today, at 50 feet and 1080p, there's no guarantee that it'll work tomorrow at 1080p/12 bit color, 1080p/16 bit color, or whatever else the next generation of the HDMI standard might throw at it.

Kurt
Blue Jeans Cable
 
Lady Phoenix

Junior Audioholic
The article says:

We have found that, at 720p and 1080i, well-made cables up to around 50 feet will work properly with most, but not all, source/display combinations. If 1080p becomes a standard, plenty of cables which have been good enough to date will fail.

So does anyone know what max length would be likely to work properly for 1080p in this case? Is it safe to assume 25 ft. is reasonable to expect?
 
bleair

Audiophyte
KurtBJC said:
Yes, coax can be manufactured at 100 ohms. However, it wouldn't be suitable for building an HDMI cable because HDMI signals are run balanced, which requires a symmetrical cable architecture (which, in effect, means twisted pair). ...
UTP works, it's cheap, and it's well understood. Have you taken your concerns to the IEEE? You are a member of the IEEE, right? Perhaps you want to let them in on the secret of coax before they get too far with all the 802.3 standards for 10G.

I'll go out on a limb here and claim that all the clusters I use daily with infiniband aren't having the problems you are alluding to. HDMI sucks and it has a huge number of problems, but the need for extra-special-manufactured-wire isn't its biggest issue.
 
craigsj

Audiophyte
Yes, coax can be manufactured at 100 ohms. However, it wouldn't be suitable for building an HDMI cable because HDMI signals are run balanced, which requires a symmetrical cable architecture (which, in effect, means twisted pair).
Balanced signals can be run over coax. You need to learn how this stuff really works.

It would also be possible to run HDMI by using two 50-ohm coaxes for each balanced line, grounding the shields in common to that pair's shield ground.
Thank you, Rube Goldberg.

I have spent a lot of time with engineering staff at Belden on this problem...
That does not automatically impart knowledge just as hanging out backstage doesn't make you a musician.

...Belden is, among other things, a world leader in super-high-frequency data cabling, and has a whole range of Gigabit products out.
Yes, and I'd point out that plenty of those products run twisted pair, thank you very much.

When the guys who know as much as anyone in the world about running high-speed data in twisted-pair cable tell me that...
That hasn't been established at all, but go ahead and appeal to your higher authority. Meanwhile, twisted pair still works.

Other high-speed data technologies aren't as fragile as HDMI. They are generally not run parallel; they are run with error correction. The fact is, and anyone who's had trouble running this stuff over distance will understand: HDMI is an unreasonably fragile interface, and the problem could easily, and cheaply, have been avoided.
But that wasn't the subject of the article, was it? None of that has to do with twisted pair, the subject of the unjustified rant.

As for "cheaply", the mere existence of the existing standard proves you wrong. The only reason to go with twisted pair is cost.

HDMI could have been done more or less as-is, but instead of four parallel balanced pairs, four 75 ohm coaxes could have been run.
As I said before, it would be more expensive and would not serve any purpose for the short distances expected of the cable. HDMI is not intended for long runs.

And if one needed to run it over extreme distance, a cheap breakout box with four BNC jacks...
Can still do that now despite what you think. There may be no products on the market that do it but that doesn't mean it's impossible. Coax and twisted pair are just two ways to construct a waveguide. You can convert between them.

BTW, balanced signals don't "know" they're balanced and neither do unbalanced ones. In order to tell the difference you need a ground. No ground...no problem.

bleair said:
I'll go out on a limb here and claim that all the clusters I use daily with infiniband aren't having the problems you are alluding to. HDMI sucks and it has a huge number of problems, but the need for extra-special-manufactured-wire isn't its biggest issue.
Exactly. InfiniBand, which I have also worked with, is one of a number of examples where higher data rates are achieved reliably over twisted pair. HDMI is a low-cost interconnect for short-distance runs. It uses cheap cable and cheap connectors. Imagining that the cause of all its problems is a glaring engineering failure of one aspect of its design, when that aspect is well established as functional, is absurd.
 
KurtBJC

Audioholic
Balanced signals can be run over coax. You need to learn how this stuff really works.
Well, sure. You can disconnect the ground and run the two sides of the balanced line to the shield and the center conductor. If you do that, you will have an increase in crosstalk on an epic scale; you've tossed out common mode noise rejection by making the line asymmetrical and at the same time increased each line's tendency to induce noise in the others by the same means.

Thank you, Rube Goldberg.
If it is a kludgy, crazy answer to split a 100 ohm balanced line into two 50 ohm unbalanced lines, you should drop a note to the guys at Agilent and Tektronix, who make test probes for the officially-sanctioned HDMI compliance test gear. That's what they do.


That does not automatically impart knowledge just as hanging out backstage doesn't make you a musician.
I hate the "appeal to authority" style of argument as much as anyone. But let me point out that I was not responding here only to your contention that I am wrong, but that I am intentionally misleading people. My point, in appealing to authority, is not to suggest that the mere fact that the engineers agree with me means I'm right (it doesn't), but is to suggest that, whether you disagree with me or not, my views are sufficiently well-founded that I ought not to be accused of "misleading" people in stating them.


Yes, and I'd point out that plenty of those products run twisted pair, thank you very much.
Yes. And, as I have pointed out: (1) they do not generally try to run these high-speed data lines PARALLEL, which imposes the difficult requirement of keeping them in time with one another and which consequently makes skew such a problem; and (2) they don't try to do it without having error-correction protocols to catch the inevitable bit errors.



As for "cheaply", the mere existence of the existing standard proves you wrong. The only reason to go with twisted pair is cost.
* * *

As I said before, it would be more expensive and would not serve any purpose for the short distances expected of the cable. HDMI is not intended for long runs.
Well, first, it wouldn't be much more expensive. VGA cable is typically a five-coax bundle, and you'd only need four of those along with the various miscellaneous added conductors to do it. For short runs, this could be done very economically because miniature cable can easily be manufactured with acceptable return loss and timing characteristics.

As for HDMI being "not intended for long runs": I don't know what the people responsible for it were thinking, but it is of course very common for people to want to run it 50 feet in a home theater environment, and 50 feet can be a challenge. Very few of the cables on the market meet spec at 50 feet; they work, when they work, mostly because there's a bit of room in the spec, which assumes "worst-case" input signals and data recovery. And we are constantly dealing with people who run long video runs in churches and auditoriums and the like, for whom 50 feet is nowhere near long enough. Lots of cable now installed out there in people's homes is going to fail when higher-resolution equipment is hooked up to it. And when it fails, I don't think people will find much consolation if someone from the HDMI organization explains to them that it was "not intended for long runs."

The purpose of having a standard that works well over distance is not just to enable it to work well at that maximum distance; it is also to give some "headroom" so that it will be rock-solid at shorter distances. HDMI is lacking that headroom, and it would have been easier to provide. The limiting factors on distance runs primarily are skew and return loss; intrapair skew would be eliminated by going unbalanced, and inter-pair (or, inter-channel) skew would be much reduced by going to coaxial construction because timing can be much more tightly controlled. Return loss, likewise, would be much better controlled in coax because impedance consistency is much better. If HDMI worked acceptably over a few hundred feet, people wouldn't have to scratch their heads and wonder whether it'd work when they try to run 1080p/16 bit color 50 feet.
 
KurtBJC

Audioholic
The article says:

We have found that, at 720p and 1080i, well-made cables up to around 50 feet will work properly with most, but not all, source/display combinations. If 1080p becomes a standard, plenty of cables which have been good enough to date will fail.

So does anyone know what max length would be likely to work properly is for 1080p in this case? Is it safe to assume 25 ft. is reasonable to expect?
For current 1080p, 8 bit color, at 25 feet, most likely any reasonably well made cable will do just fine. But these things are, unfortunately, somewhat source-and-display-dependent, so it's hard to say categorically and confidently that something will work at a given distance. I wouldn't worry about it at 25 feet, though, generally speaking.
 
craigsj

Audiophyte
Well, sure. You can disconnect the ground and run the two sides of the balanced line to the shield and the center conductor. If you do that, you will have an increase in crosstalk on an epic scale; you've tossed out common mode noise rejection by making the line asymmetrical and at the same time increased each line's tendency to induce noise in the others by the same means.
And then there are twinax cables.

The complaint here seems to be that twisted pair can't offer sufficiently constant impedance. There is twisted-pair cable certified well beyond 100MHz (Cat6 at 250MHz, and newer categories higher still), so citing Cat5 being certified to only 100MHz as evidence that twisted pair isn't adequate certainly is misleading, if not outright fraudulent.

If it is a kludgy, crazy answer to split a 100 ohm balanced line into two 50 ohm unbalanced lines, you should drop a note to the guys at Agilent and Tektronix, who make test probes for the officially-sanctioned HDMI compliance test gear. That's what they do.
I'd rather not pay test probe prices for HDMI cabling.

I hate the "appeal to authority" style of argument as much as anyone. But let me point out that I was not responding here only to your contention that I am wrong, but that I am intentionally misleading people. My point, in appealing to authority, is not to suggest that the mere fact that the engineers agree with me means I'm right (it doesn't), but is to suggest that, whether you disagree with me or not, my views are sufficiently well-founded that I ought not to be accused of "misleading" people in stating them.
By quoting hearsay from unidentified sources whose expertise is unknown and unconfirmable, you have totally failed to do that. As for your claim that you have not purposefully misled: are you saying that you are unaware that there are Cat5+ certifications well in excess of 100MHz? You do realize that GigE runs on twisted pair, right? You realize that InfiniBand does, that 10GigE will, etc.? SATA? SAS? Future PCI Express? It is a joke to suggest that twisted pair isn't up to the task, and anyone who claims to be an expert yet says so deserves to have his motives questioned.

Yes. And, as I have pointed out: (1) they do not generally try to run these high-speed data lines PARALLEL, which imposes the difficult requirement of keeping them in time with one another and which consequently makes skew such a problem; and (2) they don't try to do it without having error-correction protocols to catch the inevitable bit errors.
Some technologies do run serial bitstreams in parallel. As for the lack of error correction, that is not a twisted-pair issue, and the designers understood that the data stream was not critical in nature. Adding error correction increases cost and bandwidth requirements.

Well, first, it wouldn't be much more expensive. VGA cable is typically a five-coax bundle, and you'd only need four of those along with the various miscellaneous added conductors to do it. For short runs, this could be done very economically because miniature cable can easily be manufactured with acceptable return loss and timing characteristics.
Not all VGA cable is coax. You continue to insist that the cost difference is small but you have no basis for making such a claim. Twisted pair was chosen specifically because it is cheap.

As for HDMI being "not intended for long runs": I don't know what the people responsible for it were thinking, but it is of course very common for people to want to run it 50 feet in a home theater environment, and 50 feet can be a challenge. Very few of the cables on the market meet spec at 50 feet; they work, when they work, mostly because there's a bit of room in the spec, which assumes "worst-case" input signals and data recovery.
I disagree with your assertion that 50-foot runs are "very common". They are relatively uncommon, and I would say the designers were thinking that those customers are not of special importance. If including the requirements of a fringe group of users doubles the cost of a ubiquitous, high-volume interconnect, then supporting them would not make sense. It is clear that long runs were not a design goal, and you're upset about it. Being upset about it doesn't make you right or somehow tilt facts in your favor. The vast majority of applications for HDMI are short runs. Hell, HDMI itself is hardly ubiquitous.

And we are constantly dealing with people who run long video runs in churches and auditoriums and the like, for whom 50 feet is nowhere near long enough. Lots of cable now installed out there in people's homes is going to fail when higher-resolution equipment is hooked up to it. And when it fails, I don't think people will find much consolation if someone from the HDMI organization explains to them that it was "not intended for long runs."
There are a lot of interconnects that aren't suitable for long runs. Perhaps you should consider that a market opportunity rather than supporting sham articles criticizing engineering work simply because you don't agree with the design goals. Since when did HDMI ever promise to support the needs of your church users? What were they happily using before they were stabbed in the back by the false promises of twisted pair?

The purpose of having a standard that works well over distance is not just to enable it to work well at that maximum distance; it is also to give some "headroom" so that it will be rock-solid at shorter distances. HDMI is lacking that headroom, and it would have been easier to provide. The limiting factors on distance runs primarily are skew and return loss; intrapair skew would be eliminated by going unbalanced, and inter-pair (or, inter-channel) skew would be much reduced by going to coaxial construction because timing can be much more tightly controlled. Return loss, likewise, would be much better controlled in coax because impedance consistency is much better. If HDMI worked acceptably over a few hundred feet, people wouldn't have to scratch their heads and wonder whether it'd work when they try to run 1080p/16 bit color 50 feet.
Just who is scratching their head on that? What is the volume of equipment today offering 16 bit 1080p?

Anyway, all of this is just another rehash of your assertion that twisted pair is inadequate and coax is better. Good engineering is about developing solutions that best address needs, and among those needs is always cost. A solution designed to achieve the best possible performance regardless of cost is a poorly engineered one, and money spent exceeding a spec is wasted money. Good engineers understand problems and don't overdesign products. You are clearly not one of those.

Why don't you simply write an article explaining that you don't like that HDMI has a 50 foot limit? That's what it all boils down to after all.
 
