I am trying to find hard evidence, via tests, to support higher-quality HDMI cables.
My reasoning tells me that there can be, and is, a difference between HDMI cables of different quality. I know that of course we are talking 1's and 0's, but we are talking about billions of them a second. A lower-quality cable should have a larger margin of error when transmitting that digital signal than a higher-quality cable would.
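Just to put rough numbers on "billions of them a second", here is a back-of-the-envelope sketch in Python. The timing values are for a standard 1080p/60 signal; the exact rate depends on the HDMI version and blanking intervals:

```python
# Back-of-the-envelope bit rate for a 1080p/60 HDMI signal.
# HDMI carries video over three TMDS channels with 8b/10b encoding,
# so every 8 payload bits become 10 bits on the wire.

h_total, v_total = 2200, 1125   # pixels per line/frame incl. blanking (CEA-861 1080p60 timing)
fps = 60
bits_per_pixel = 24             # 8 bits per color, RGB
tmds_overhead = 10 / 8          # 8b/10b encoding

pixel_clock = h_total * v_total * fps    # 148,500,000 Hz = 148.5 MHz
payload = pixel_clock * bits_per_pixel   # video bits per second
on_wire = payload * tmds_overhead        # bits actually crossing the cable

print(f"Pixel clock: {pixel_clock / 1e6:.1f} MHz")
print(f"Payload:     {payload / 1e9:.2f} Gbit/s")
print(f"On the wire: {on_wire / 1e9:.2f} Gbit/s")   # ~4.5 Gbit/s
```

So even plain 1080p pushes around four and a half billion bits down the cable every second, which is the scale I mean.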
Data through HDMI is different from data through a computer. Data through a computer moves at a rate that is probably, on average, 20 times slower. Plus, the data doesn't have to be sequential, meaning it can arrive in any order, and the computer verifies that all the data is there before it finishes; if the file is missing any data, then oftentimes it is completely corrupt.
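That verify-before-use step is the key difference. A file copy can be checksummed on both ends and resent if the hashes disagree; a live video stream has no time for that. A minimal sketch of the idea (the file names are hypothetical):

```python
# Sketch of why a file transfer survives a flaky link: the receiver can hash
# the whole file and request a resend on mismatch. A real-time HDMI stream
# has no such second chance; a bad bit is simply displayed.
import hashlib

def sha256_of(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# "original.bin" / "received.bin" are hypothetical file names.
if sha256_of("original.bin") == sha256_of("received.bin"):
    print("Transfer verified bit-for-bit")
else:
    print("Mismatch detected -- retransmit")
```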
With a television and video/audio signals, we are transferring a lot more data. Not only more, but it has to be sent in a specific order and processed quickly to produce a live display of audio or video. If some of that data does not make it through the cable, or 1's get interpreted as 0's, then it would result in a slight variance in picture or audio quality.
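To illustrate what a single misread bit could mean for one pixel, here is a minimal sketch (the value is hypothetical; which bit gets hit decides whether the error is invisible or a visible "sparkle"):

```python
# Minimal sketch: what one flipped bit does to an 8-bit color value.
# A hit on a low-order bit shifts the value by 1 (invisible); a hit on the
# high-order bit shifts it by 128 (a one-frame "sparkle" on that pixel).

value = 180  # hypothetical red-channel value, 0-255

for bit in (0, 7):                  # flip the LSB, then the MSB
    corrupted = value ^ (1 << bit)
    print(f"bit {bit} flipped: {value} -> {corrupted} (off by {abs(corrupted - value)})")
```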
I tested my logic (mind you, by eye and ear) with a couple of different setups. First I used an AudioQuest 6' Forest cable, hooked into a $2,000 Yamaha receiver playing a music CD through some B&W CM9 towers, and listened to a short audio track. I then hooked up an AudioQuest Carbon cable ($229), which boasts 5% silver conductors and a silver tip for better quality, and listened again. Now, it could just be my mind playing tricks on me, but I heard a slight audio quality difference between the two cables, with the Carbon sounding a little bit cleaner.
I also did the test with video: a 1080p Blu-ray movie on a 55" Mitsubishi LED 240 Hz display. I know the Hz doesn't matter for the amount of data pushed through the cable, but to my knowledge, when the TV is repeating frames multiple times, it is actually easier to see imperfections in the picture itself. So I had a 6' Monster 700 series cable hooked up, played a scene from Pirates of the Caribbean, and watched the short clip a few times; then I swapped in the Carbon cable and watched the same clip. Now, in both pictures there was some amount of digital noise; there always will be, get up close to any HD picture and you'll see it... but there was less with the 6' Carbon cable, and the picture actually looked a little sharper and less distorted up close.
Now, I am by no means an audio/video expert, but I have been trying to find articles to support my findings and logic, and I always find just the opposite. Yet no one seems to test data integrity from one end of a cable to the other; I want to know whether 100% of the data makes it through, or less... I have seen the "eye" tester (I think it's even in an article on this web page), and, at least logically, the fact that the eye looks different on different cables would tell me that different amounts of data are making it through on each one.
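If I could capture raw frames on the display side (say, with an HDMI capture card), the test I'm describing would just be a pixel-for-pixel comparison against the source. A sketch of that idea, assuming hypothetical raw frame dumps:

```python
# Sketch of the end-to-end integrity test I can't find anywhere: dump a raw
# frame from the source and the same frame captured after the cable (e.g. via
# an HDMI capture card), then count differing bytes. File names are hypothetical.
import numpy as np

reference = np.fromfile("source_frame.raw", dtype=np.uint8)
captured = np.fromfile("captured_frame.raw", dtype=np.uint8)

diff = np.count_nonzero(reference != captured)
print(f"{diff} of {reference.size} bytes differ ({100 * diff / reference.size:.6f}%)")
```

A count of zero over many frames would mean the cheap cable is passing everything; anything nonzero would be the hard evidence I'm looking for.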
So... are my theory and tests correct? Or am I just blinded by my idea, and seeing and hearing what I want to see and hear?