There's no such thing as digital

fmw

Audioholic Ninja
I think the problem with companies like Benchmark, which seem to be very engineer-driven, is that they get trapped into playing the same BS marketing game of "higher sampling rates" and "less jitter" if they want to stay competitive and sell any DACs. For a long time Benchmark's DAC was 24/96 while everybody else's was 24/192, and it measured just as well as the competition. (So be smart and buy the older model for less money.)
Or better yet, use the DAC that is already built into your BD player or receiver. All DACs use OEM conversion chips, so they all do the same job in the same way. There is no magic to D/A conversion.
 
cpp

Audioholic Ninja
To be fair, jitter does exist and is measurable. But the delays are in picoseconds and are not even close to being audible. If jitter were audible, it would be bad enough that data transmission would be impossible. The audiophile community has used it to justify things like expensive digital cables.
Of course it exists, and yes, it's measurable. When I worked at Bell Labs/Lucent we measured it. But can normal listeners hear it when it occurs, and does jitter actually take away from the enjoyment of music when, as you noted, it happens on a picosecond scale? A picosecond is one millionth of one millionth of a second. If a jitter error lasts that long, or even somewhat longer, does it really matter to a listener? I say nope. Superman, maybe, but not normal humans who happen to enjoy music. Manufacturers of DACs will talk about new improvements to reduce jitter; they have to, or they lose sales as they compete for the market. We all get caught up in "the next best thing," and some pay for it regardless, be it AVRs, speakers, cell phones, TVs, cameras, etc.
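For a sense of scale, here is a rough back-of-the-envelope sketch comparing a picosecond of timing error to one sample period. The 96 kHz sample rate and 100 ps jitter figure are illustrative assumptions, not measurements of any particular DAC.

```python
# Rough scale check; sample rate and jitter figure are illustrative assumptions.
sample_rate_hz = 96_000
sample_period_ps = 1e12 / sample_rate_hz   # one sample period in picoseconds (~10.4 million ps)
jitter_ps = 100                            # a fairly pessimistic clock-jitter figure

print(f"Sample period at {sample_rate_hz} Hz: {sample_period_ps:,.0f} ps")
print(f"{jitter_ps} ps of jitter is {jitter_ps / sample_period_ps:.1e} of one sample period")
```

Even a pessimistic 100 ps of jitter shifts a sample by roughly one hundred-thousandth of a sample period.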
 
Irvrobinson

Audioholic Spartan
To be fair, jitter does exist and is measurable. But the delays are in picoseconds and are not even close to being audible. If jitter were audible, it would be bad enough that data transmission would be impossible. The audiophile community has used it to justify things like expensive digital cables.
That's not how jitter on PCM signals works. You're thinking of jitter in the underlying analog signaling that represents the bit stream. It's true that if jitter in the electrical or optical signaling exceeds a certain level you can get transmission errors, but jitter at the PCM conversion stage manifests itself as distortion that is unrelated to the reconstructed analog signal, which is bad. The theory has always been that because PCM jitter-induced distortion is random, it is far more obtrusive than classic analog harmonic distortion. Even if there was a real problem in the early days of ladder DACs, which might explain some strong preferences I had back then, DAC chips are so good these days that pretty much any distortion is at -95 dB or below. I'm skeptical that anything at -95 dB, even a fog horn, is audible.
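To put a number on that, the textbook worst-case estimate of jitter-limited SNR for a full-scale sine wave is SNR = -20*log10(2*pi*f*t_j). The sketch below evaluates it for an assumed 20 kHz tone and 100 ps of RMS clock jitter, which lands in the same ballpark as the -95 dB figure above.

```python
# Textbook jitter-limited SNR for a full-scale sine: SNR = -20*log10(2*pi*f*t_j).
# The frequency and jitter values are assumptions chosen for illustration.
from math import pi, log10

f_hz = 20_000          # worst case: highest audio frequency
t_jitter_s = 100e-12   # assumed 100 ps RMS sampling-clock jitter
snr_db = -20 * log10(2 * pi * f_hz * t_jitter_s)
print(f"Jitter-limited SNR: {snr_db:.0f} dB")   # ~98 dB, i.e. jitter products near -98 dBFS
```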
 
fmw

Audioholic Ninja
OK, I'll accept your input, and thank you for it. We went to a lot of trouble to test it and couldn't find a trace of audibility. Responses were almost exactly 50/50 between relatively high-jitter and low-jitter units. I can confirm your skepticism unless something has changed dramatically in the past several years.
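The post doesn't give trial counts, so the numbers below are hypothetical, but a quick binomial check shows why near-50/50 responses are indistinguishable from guessing: with, say, 52 correct answers out of 100 trials, the chance of doing at least that well by coin-flipping is large.

```python
# Hypothetical trial counts (the thread gives none), just to show the reasoning.
from math import comb

correct, trials = 52, 100
# One-sided p-value: probability of scoring >= `correct` out of `trials` by pure guessing (p = 0.5)
p_value = sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials
print(f"P(>= {correct}/{trials} by chance) = {p_value:.2f}")   # ~0.38, consistent with guessing
```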
 