HDMI cables and quality.


prowl33

Audiophyte
I am trying to find hard evidence via tests to support higher quality HDMI cables.

My way of thinking tells me that there can be, and is, a difference between HDMI cables of different quality. I know that of course we are talking about 1's and 0's, but we are talking about billions of them a second. A lower quality cable should have a larger margin of error when transmitting that digital signal than a higher quality cable would.

Data over HDMI is different from data through a computer. A computer moves data at a rate that is probably, on average, 20 times slower, and the data doesn't have to be sequential, meaning it can arrive in any order; the computer verifies that all the data is there before it finishes, and if a file is missing any data, it is often completely corrupt.

With a television and video/audio signals, we are transferring a lot more data. Not only more, but it has to be sent in a specific order and processed quickly to produce a live display of audio or video. If some of that data does not make it through the cable, or 1's get interpreted as 0's, the result would be a slight variance in picture or audio quality.
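To put a rough number on "billions of them a second," here is a quick back-of-the-envelope calculation. This is just a sketch based on the standard 1080p60 timing figures, not a measurement of any particular cable:

# Rough HDMI 1080p60 data-rate estimate (standard 1080p/60 timing assumed).
pixel_clock_hz = 148_500_000   # 1080p60 pixel clock: 148.5 MHz
bits_per_pixel = 24            # 8 bits each for R, G, B
tmds_overhead = 10 / 8         # TMDS encodes every 8 data bits as 10 line bits

video_bits_per_second = pixel_clock_hz * bits_per_pixel * tmds_overhead
print(f"Raw TMDS video rate: {video_bits_per_second / 1e9:.2f} Gbit/s")
# -> roughly 4.46 Gbit/s of video alone, i.e. billions of bits every second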

I tested my logic, mind you by eye and ear, with a couple of different setups. First I used a 6' AudioQuest Forest cable, hooked in through a $2000 Yamaha receiver, playing a music CD through some B&W CM9 towers, and listened to a short audio track. I then hooked up a $229 AudioQuest Carbon cable, which boasts 5% silver conductors and a silver tip for better quality, and listened again. Now, it could just be my mind playing tricks on me, but I heard a slight audio quality difference between the two cables, the Carbon sounding a little bit cleaner.

I also did the test with video: a 1080p Blu-ray movie to a 55" Mitsubishi LED 240Hz display. I know the Hz doesn't matter for the amount of data pushed through the cable, but to my knowledge, when the TV is repeating frames multiple times it is actually easier to see imperfections in the picture itself. So I had a 6' Monster 700 series cable hooked up, played a scene from Pirates of the Caribbean, watched the short clip a few times, swapped in the Carbon cable, and watched the same clip. Now, in both pictures there was some amount of digital noise, there always will be, get up close to any HD picture and you see it... but there was less with the 6' Carbon cable, and the picture actually looked a little sharper and less distorted up close.

Now, I am by far NOT an audio/video expert, but I have been trying to find articles to support my findings and logic, and always find just the opposite. Yet no one seems to test data integrity from one end of a cable to the other; I want to know if 100% of the data makes it through, or less... I have seen the "eye" tester, I think it's even in an article on this web page, and, at least logically, the fact that the eye looks different on different cables would tell me that different amounts of data are making it through on each one.

So... are my theory and tests correct? Or am I just blinded by my idea, and seeing and hearing what I want to see and hear?
 

markw

Audioholic Overlord
It's possible what you report is 100% true. It's also possible that you're realizing exactly what you expect to find.

This is a perfect candidate for some controlled blind testing at least. Perhaps even a double blind test, no puns intended.
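If you do try it, keeping score is simple enough. Here is a rough sketch (in Python, with made-up example numbers) of how you'd check whether a run of blind trials beats plain guessing:

import math

def binomial_p_value(correct, trials):
    # Chance of getting at least this many right by flipping a coin.
    return sum(math.comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Example: a listener picks the right cable in 9 of 12 blind trials.
correct, trials = 9, 12
print(f"{correct}/{trials} correct, p = {binomial_p_value(correct, trials):.3f}")
# A p value this large (about 0.07) is still consistent with guessing.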
 

prowl33

Audiophyte
How about just my theory on the whole thing?

I think what I'm going to do is a test that pits one end of the spectrum against the other. I'm going to hook up a Blu-ray player to a 65" TV using a 9' Insignia cable, very low quality and a decent length, versus the 6' Carbon cable, and see if those polar opposites yield significant results. If they do, then in theory, cables closer in quality may yield results too, just less noticeable. Kind of like when you look at one model of TV, and the next model up, where the only difference is a slight increase in contrast (just as an example).
 

markw

Audioholic Overlord
Theories are great, but they are still theories.

How about just my theory on the whole thing? ...
Now all you have to do is submit proof that what you seem to observe is, in fact, a fact.

Until they are proven to be factual they are just that, unproven theories. To be accepted as fact (at least in serious circles), they must be proven by testing, and that is another step entirely.

Did I just fall for another troll, the likes of which seem increasingly common as of late?
 
j_garcia

Audioholic Jedi
There are already about 500 threads about this. There IS NO EVIDENCE. It has already been tested and proven that under ~10', there isn't going to be a meaningful difference between inexpensive cables and expensive ones.
 

prowl33

Audiophyte
Not trolling in the least bit. Also, I read that article on this website, and in a way it supports the possibility of my theory even more. The data analyzer, or the "eye", looks different with different cables because the different cables are passing through different data, due to signal quality lost from source to TV. How noticeable this is, is a totally different animal though.

Also, I have looked at dozens of articles, none of which test data integrity, i.e. how much of the data gets from the source to the receiver or TV; they either test a pass/fail or you just see people say that each cable looks good...

I have also yet to see any tests whatsoever on the relation of HDMI cables to audio quality, and whether a speaker will actually put out different audio through different quality HDMI cables. Also, all the tests are fairly dated, going back to HDMI 1.3, testing 720p and some 1080p; none of them are testing 3D, 3D at 12-bit, or 4K x 2K. And although the TV itself is doing the 120Hz, 240Hz, 480Hz upscaling, will it upscale a signal taken from a worse cable any differently from a signal taken from a high quality cable? If you take the same mistake and repeat it to the TV 4 or 8 times, won't it become more distinguishable?

I think all of these are valid questions, and while I don't have the tools or ability to conduct professional tests that would be meaningful to anyone in this forum, I will conduct some informal tests with people, comparing some of the highest quality cables against some of the lowest quality ones, in lengths under 10', to see what it yields.
 

Nestor

Senior Audioholic
I am trying to find hard evidence via tests to support higher quality HDMI cables. ...

So... are my theory and tests correct? Or am I just blinded by my idea, and seeing and hearing what I want to see and hear?
Your post answered your own question.
 
j_garcia

Audioholic Jedi
Not trolling in the least bit. Also, I read that article on this website, and in a way it supports the possibility of my theory even more. ...
I would have to say this would be comparable to doing testing to prove that gravity exists.

You can read this also:
http://www.bluejeanscable.com/articles/hdmi-cable-information.htm
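The short version of why a somewhat worse-looking eye doesn't mean different picture data: the receiver only decides whether each symbol sits above or below a threshold, so moderate degradation still recovers exactly the same bits. A toy simulation (made-up noise levels, nothing HDMI-specific) shows the cliff:

import random

def bit_errors(noise_amplitude, n_bits=100_000, seed=1):
    # Send bits as +1/-1 levels, add uniform noise, decide by sign.
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_bits):
        bit = rng.choice((0, 1))
        level = 1.0 if bit else -1.0
        received = level + rng.uniform(-noise_amplitude, noise_amplitude)
        errors += ((1 if received > 0.0 else 0) != bit)
    return errors

for noise in (0.3, 0.6, 0.9, 1.2, 1.5):
    print(f"noise {noise:.1f}: {bit_errors(noise)} errors in 100000 bits")
# Below 1.0 the noise never crosses the decision threshold, so errors stay at
# zero; the link only starts failing once the margin is actually exceeded.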
 

prowl33

Audiophyte
I ran some demonstrations with 3 of my co-workers, using a $40 9' HDMI cable and a $110 6' HDMI cable. In the audio test, all 3 of my co-workers and I could point out better sound separation, and the voice being clearer, out of the more expensive cable playing AC/DC's Highway to Hell.

For the video we played scene 7 from Pirates of the Caribbean, with a Denon 1600 Blu-ray player and a 63" Samsung C8000, and could see a slight difference in the noise in the background and maybe a little better motion, although it would maybe be easier to tell with 2 setups side by side.

I'm still curious to see someone do a machine-based test on it, measuring data sent vs. data received, to see how much difference there really is.
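In principle, the test I have in mind would look something like this: push a known test pattern out of the source, capture the frame at the far end of the cable, and count mismatching pixels. This is only a sketch; the file names are placeholders and I'm assuming the capture gear hands you uncompressed RGB frames of the same size:

import numpy as np

def compare_frames(sent_path, received_path):
    # Count pixels that differ between the pattern sent and the frame captured.
    sent = np.load(sent_path)          # hypothetical: pattern saved as an array
    received = np.load(received_path)  # hypothetical: capture saved the same way
    assert sent.shape == received.shape, "frames must have identical dimensions"
    differing = np.any(sent != received, axis=-1)  # any channel mismatch per pixel
    pct = 100.0 * differing.sum() / differing.size
    print(f"{int(differing.sum())} of {differing.size} pixels differ ({pct:.4f}%)")

# compare_frames("pattern_1080p.npy", "capture_cheap_cable.npy")
# compare_frames("pattern_1080p.npy", "capture_expensive_cable.npy")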
 
jinjuku

Moderator
I ran some demonstrations with 3 of my co-workers, using a $40 9' HDMI cable and a $110 6' HDMI cable. ...
Looks like you found your cable then...:confused:
 
mtrycrafts

Seriously, I have no life.
... but I have been trying to find articles to support my findings and logic, and always find just the opposite ...
That is a good clue that your findings are not based in facts. ;):D
 
Pyrrho

Audioholic Ninja
That is a good article. Here is a good quote:


I have to come away saying that most cables under 4-5 meters will pass just about anything in today's arsenal of 1080p - and that's likely to include Deep Color if and when it ever makes an appearance (not likely soon due to current Blu-ray limitations). For cables over 5 meters it's a good bet that you'll want to stick with trusted manufacturers who deliver on their specs. For long cables, Blue Jeans Cable, DVIGear, MonoPrice, Monster Cable 800HD, and WireWorld seem to be the best bets of the cables we tested - however the price variance between these cables is revealing.
http://www.audioholics.com/education/cables/long-hdmi-cable-bench-tests/evaluation-conclusion

Since I only use short HDMI cables, I have never had a problem with them, no matter how cheap I've gone with them. And I have some that are really amazingly cheap.

I think the best advice is to avoid long runs of HDMI cables if possible, and if not, pay attention to what that article says about quality cables. And buy cheap HDMI cables for the short runs, as anything else is a waste of money (as is shown in the article).
 
mtrycrafts

Seriously, I have no life.
I ran some demonstrations with 3 of my co-workers, using a $40 9' HDMI cable and a $110 6' HDMI cable. ...
Please tell us how you conducted the test. Doesn't sound very good to me.
You should really try to do split screens and still picture frames as a start.
And, why would the cable have such an impact on the audio channels? Hardly the case. Need better data than what you have presented.
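If you can grab stills from both cables (however you manage the capture), something along these lines makes the comparison objective instead of eyeballed; purely a sketch, the file names are placeholders:

from PIL import Image, ImageChops

cheap = Image.open("still_cheap_cable.png").convert("RGB")
pricey = Image.open("still_expensive_cable.png").convert("RGB")

# Split screen: left half from one capture, right half from the other.
w, h = cheap.size
split = cheap.copy()
split.paste(pricey.crop((w // 2, 0, w, h)), (w // 2, 0))
split.save("split_screen.png")

# Difference image: identical captures come out solid black.
diff = ImageChops.difference(cheap, pricey)
diff.save("difference.png")
print("Max per-channel difference:", max(hi for _lo, hi in diff.getextrema()))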
 
