<table border="0" align="center" width="95%" cellpadding="0" cellspacing="0"><tr><td>
Dan Banquer: Chuck;
Try the same test at -60 dB down from full scale, not -10 dB down.
d.b.
I did measure at -50 dB; my level control gets too touchy to hit -60, but... at 50 dB down rather than 10 dB down, the signal-to-noise ratio is reduced by 40 dB, and the THD+N increases accordingly.
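To put numbers on that, here's a quick back-of-the-envelope sketch. The -100 dBFS noise floor is just an assumed figure for illustration, not my actual measurement; the point is only that with a fixed noise floor, dropping the tone drops the SNR dB-for-dB:

```python
# Illustrative arithmetic only -- the -100 dBFS noise floor is an assumed
# figure, not a measured one. With a fixed residual noise+distortion floor,
# dropping the test tone lowers the SNR dB-for-dB, so THD+N (referenced to
# the signal) rises by the same amount.

NOISE_FLOOR_DBFS = -100.0  # assumed noise+distortion floor, re: full scale

def thd_n_db(signal_dbfs: float) -> float:
    """THD+N relative to the signal, in dB (more negative is better)."""
    return NOISE_FLOOR_DBFS - signal_dbfs

def thd_n_percent(signal_dbfs: float) -> float:
    """The same figure expressed as a percentage."""
    return 100.0 * 10.0 ** (thd_n_db(signal_dbfs) / 20.0)

for level in (-10.0, -50.0, -60.0):
    print(f"{level:6.1f} dBFS tone: THD+N = {thd_n_db(level):6.1f} dB "
          f"({thd_n_percent(level):.4f} %)")
```

So a tone 40 dB lower reads 40 dB worse in THD+N even though nothing in the converter changed, which is exactly what I saw.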
I swear it almost looks like I'm testing the same internal loop in every case, so I keep having to disconnect my antenna wire cable to make sure that it's actually in the loop. I've tried, and I can't even force the SPDIF cable to misbehave. As long as it's transferring a signal without so many errors that the link drops, the measurements are identical. (Naturally they're different when there is no signal due to errors or a broken connection.)
Dan, please keep in mind that I'm not representing these results as something we might expect with any given CD or DVD player. All I'm saying (and trying to show) is that if the system is well designed, jitter at the serial input doesn't have enough of an effect on the conversion clock to worry about. You agree that jitter only matters at the conversion point, and all I'm doing is using a converter with a clock that is clean at the conversion point. It's well (or should I say "properly"?) isolated from disturbances elsewhere in the system. If it weren't, we'd see changes in the THD+N with that nasty SPDIF mess I used in the last test. Seriously Dan, I really expected to see *something* in that last crazy test, but nada.
I have no doubt that you've seen what you've seen, so some equipment must have more problems than other equipment. Actually, that's not much of a surprise. Haven't I shown pretty clearly that a crummy SPDIF cable doesn't necessarily cause an increase in THD+N due to increased jitter at the conversion point? Take note of my use of the word "necessarily." That leaves leeway for lots of things like design errors and corner cutting.
I really don't have any idea of what I might be looking for here. If the clock is jittered excessively at the point of conversion we'll see it in the measurements, but I know going in that I'm using a clock that won't be affected by such things. It makes me feel like I'm playing with loaded dice. The only way I can lower my S/N with jitter would be to go in and inject something at the point of conversion. The DAC I'm using is well designed, and the clock has very low jitter at the conversion point. It's also sufficiently well isolated from the other clocks and signals, and frankly I don't see how a clock can be said to be clean if it's not well and thoroughly isolated from common external disturbances. The whole thing here is just what you said a while back. As long as I keep the clock at the conversion point clean we're not going to see anything in these measurements. We both know that, but it seems that you expect some combination to eventually corrupt my converter. That would be a monumental surprise to me, but it wouldn't be the first.
Think of it this way Dan. I'm cheating, because you ain't a-gonna jitter my conversion clock, no way, no how. The jitter will always be low enough to give the noise floor you see in the measurements, as long as the thing sees a usable or recoverable data stream. Since we agree that nothing bad happens as long as I keep the clock clean at the conversion point, and since I have every intention of keeping it clean at the conversion point, we are really just spinning our wheels. There is no reason for a good audio component to have more problems with digital audio than a modestly priced sound card has, and that's all I'm saying. If I can do it in my PC for a few bucks, using common chips found in home audio gear, it should be at least as easy to do it in a stand-alone digital audio player. The PC environment is no less nasty than the typical audio environment.
Did you know that TI/BB has actually targeted some of their cutting-edge audio products at the computer industry rather than the audio industry? They are more interested in getting a quick return on their investment than they are in advancing the state of the audio art. It isn't that the engineers feel that way; it's simply a result of the corporate business model and the state of the market.
This all makes me wonder if a really cheap soundcard, if carefully selected, might not actually outperform some of the DACs in some of the more expensive receivers. You guys see a LOT more consumer equipment than I'll ever see. Is it really all that bad? I thought I had a jaded view of the audio industry, but I've managed to keep myself believing that there are still a few competent designs and designers out there. Read the app notes and design a converter. If you can follow instructions and don't cut corners, you can do what I'm doing at low cost. For free, if you register and can qualify for evaluation samples from TI. I suspect that the latest data from Crystal is equally complete, so how come we have all these problems cropping up in audio gear? It's mind blowing, Dan, because half the time it's the result of imagined problems, and the other half it's the result of bad design decisions, and I just don't get it. (Sorry for that rant, but I really DON'T get it.)
I wonder if there would be any merit in stress testing the DACs in receivers. The DAC should do what my DAC does, simply ignore jitter in the serial input data stream, and that should be pretty easy to test. Just use a crummy cable with a high VSWR and clobber the waveform just short of causing unrecoverable bit errors, then compare the THD+N figures with what you measured with a high quality (and short) optical or coaxial digital cable. Any difference would (or might) indicate that the DAC is unnecessarily sensitive to jitter at its serial input. I for one would like to know how common such "problems" actually are.
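If anyone wants to try it, here's roughly what the comparison step looks like in Python, assuming you've captured the DAC's analog output with a clean ADC for both cables. The notch-the-fundamental method is a standard THD+N estimate; the capture data below is synthetic, just to show the mechanics:

```python
# A rough sketch of the comparison step, assuming the DAC's analog output
# has been captured (good cable vs. crummy cable) with a clean ADC. Notch
# out the fundamental in the spectrum, then take the ratio of everything
# that's left to the fundamental. The captures here are synthetic.

import numpy as np

def thd_n_db(samples: np.ndarray, fs: float, tone_hz: float,
             notch_bw_hz: float = 20.0) -> float:
    """Estimate THD+N: remove the fundamental with a narrow FFT notch,
    then compute residual power relative to fundamental power, in dB."""
    windowed = samples * np.hanning(len(samples))
    power = np.abs(np.fft.rfft(windowed)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)

    fundamental = np.abs(freqs - tone_hz) <= notch_bw_hz
    signal_power = power[fundamental].sum()
    residual_power = power[~fundamental].sum()
    return 10.0 * np.log10(residual_power / signal_power)

# Hypothetical one-second captures of a 1 kHz test tone at 48 kHz.
fs, tone, n = 48000.0, 1000.0, 48000
t = np.arange(n) / fs
rng = np.random.default_rng(0)
capture_good = np.sin(2 * np.pi * tone * t) + 1e-5 * rng.standard_normal(n)
capture_crummy = np.sin(2 * np.pi * tone * t) + 1e-5 * rng.standard_normal(n)

print(f"good cable:   {thd_n_db(capture_good, fs, tone):6.1f} dB")
print(f"crummy cable: {thd_n_db(capture_crummy, fs, tone):6.1f} dB")
```

In a well-designed unit the two figures should match to within measurement repeatability; any consistent gap points at input-jitter sensitivity.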
See ya,
Chuck