CD v. CD-R, Jitter, Error correction, and more
A few concepts, misconceptions, opinions and such...
1) Jitter. Assuming a properly built transport and D/A converter, jitter is a non-issue in playback. Theoretically it would be possible to build a CD player with separate non-synchronized clocks for the read-head electronics and the D/A converter, but as this (incorrect) scheme would be more expensive and complex, it is fairly safe to assume a single internal master clock.
Digital clock jitter is only an issue when 2 digital devices are in communication - one as the source and the other as a decoder - and are not following a common clock source. Most devices (e.g. receivers, pre-pros) are smart enough to detect the clock signal at the active digital input and lock to it.
It is preferable (and the configuration in most professional studios) to have a separate master clock device and all other devices slaved to it, but consumer gear doesn't generally offer us any such option for external master clock.
2) Ripping to CD-R. Since ripping a CD is almost extracting the raw data (more on the "almost" in a bit), jitter is also not an issue. Same with writing back out. Computer clocks (in the GHz range) run orders of magnitude faster than audio gear, so the resolution is far beyond sufficient to handle 44.1 kHz, 48 kHz, or even 192 kHz with timing errors far smaller than our ears or our audio gear could detect.
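To put a rough number on that (the 1 GHz figure is just an assumed, conservative computer clock, chosen for illustration):

```python
# Rough numbers: how many ticks of a (conservatively assumed) 1 GHz
# computer clock fit between two consecutive samples at common audio rates.
CPU_CLOCK_HZ = 1e9

for rate_hz in (44_100, 48_000, 192_000):
    period_us = 1e6 / rate_hz       # microseconds between samples
    ticks = CPU_CLOCK_HZ / rate_hz  # clock ticks per sample period
    print(f"{rate_hz} Hz: {period_us:.2f} us/sample, ~{ticks:,.0f} clock ticks")
```

Even at 192 kHz there are thousands of clock ticks per sample period to play with.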
2.5) Here's the kicker: Most people are under the misconception that data on a CD somewhat closely resembles an AIFF or WAV audio file. This is absolutely not the case. Most people are also under the impression that a pit on a CD is equivalent to either a "1" or a "0". Also not the case.
2.5.1) Words. Every 16-bit sample encoded in a WAV or AIFF file (or produced by an A/D converter) is broken into 2 8-bit "words" before being written to a CD-R or encoded onto a master for stamping in a production line.
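The split is plain byte-slicing; a minimal sketch:

```python
def split_sample(sample: int) -> tuple[int, int]:
    """Split one 16-bit sample into two 8-bit words: (high byte, low byte)."""
    assert 0 <= sample <= 0xFFFF
    return (sample >> 8) & 0xFF, sample & 0xFF

def join_words(hi: int, lo: int) -> int:
    """Reassemble the original 16-bit sample from its two 8-bit words."""
    return (hi << 8) | lo
```

For example, `split_sample(0xABCD)` gives `(0xAB, 0xCD)`, and `join_words` reverses it exactly - this step is lossless.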
2.5.2) Pits and Lands. 20 years ago the available optical technology wasn't able to accurately distinguish the transitions in reflected laser intensity that a word such as "10101010" would produce. In fact, the pits would have had to be roughly twice as large as they are. Some incredible minds at Sony and Philips came up with a scheme (EFM, Eight-to-Fourteen Modulation) that encodes every 8-bit word as 14 bits, chosen so that every run of identical states is at least 3 and at most 11 channel bits long.
Example: "Pit, Pit, Pit, Land, Land, Land" is OK, "Pit, Land, Pit" is not.
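That run-length rule is easy to check mechanically. A toy checker, using the real EFM limits of 3 to 11 channel bits ('P' and 'L' stand for pit and land):

```python
from itertools import groupby

def valid_run_lengths(track: str, min_run: int = 3, max_run: int = 11) -> bool:
    """True if every run of identical states ('P' = pit, 'L' = land)
    is between min_run and max_run channel bits long."""
    return all(min_run <= sum(1 for _ in run) <= max_run
               for _, run in groupby(track))
```

So `valid_run_lengths("PPPLLL")` is `True`, while `valid_run_lengths("PLP")` is `False`.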
On top of that, instead of reading the pits and lands themselves as "0"s and "1"s, it is the transition between them that reads as a "1" and the lack of a transition as a "0".
Example: "Land, Land, Land, Pit, Pit" would equal "0010"
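That transition rule is an NRZI-style reading, and a few lines of code reproduce the example:

```python
def decode_transitions(states: str) -> str:
    """Read a pit/land sequence NRZI-style: a change between neighbouring
    positions is a '1', no change is a '0'."""
    return "".join("1" if a != b else "0" for a, b in zip(states, states[1:]))
```

Here `decode_transitions("LLLPP")` returns `"0010"`, matching the example above: five states yield four between-state readings.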
Combining these 2 schemes in the proper order and with the proper algorithms produces the channel data that is actually stamped onto a CD.
2.5.3) Error correction. CDs don't use a simple CRC; they use CIRC (Cross-Interleaved Reed-Solomon Coding): two Reed-Solomon codes with the data interleaved between them. Interleaving spreads the burst of errors from a scratch across many code words, so the data can usually be reconstructed even if the CD is slightly damaged.
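CIRC itself is involved, but the benefit of interleaving alone is easy to demonstrate with a toy block interleaver (this is an illustration only, not the real CIRC layout):

```python
# Toy block interleaver: write data row-major into `depth` rows, read it
# column-major. A burst of damage on the "disc" then comes back scattered
# across many code words instead of wiping out one word completely.

def interleave(data: bytes, depth: int) -> bytes:
    width = len(data) // depth
    return bytes(data[r * width + c] for c in range(width) for r in range(depth))

def deinterleave(data: bytes, depth: int) -> bytes:
    width = len(data) // depth
    return bytes(data[(i % width) * depth + i // width] for i in range(len(data)))

payload = bytes(range(16))            # four 4-byte "code words"
sent = interleave(payload, depth=4)

damaged = bytearray(sent)
damaged[4:8] = b"\xff" * 4            # a 4-byte burst error on the "disc"
received = deinterleave(bytes(damaged), depth=4)

# After deinterleaving, each 4-byte word contains at most one bad byte --
# within reach of a code that corrects a single error per word.
bad_per_word = [sum(received[w * 4 + j] != payload[w * 4 + j] for j in range(4))
                for w in range(4)]
```

Without interleaving, the same 4-byte burst would have destroyed an entire word at once; with it, each word sees exactly one bad byte.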
2.6) There and Back Again. Unless you like filling landfills with plastic, take my word on this. Turn off all subjective enhancement tools in your software/hardware like "Dynamic Range Correction" or "Level Enhancement" or whatever gimmicky names Software Publisher X gave them, make 6 consecutive rips and burns (CD to CD-R1, CD-R1 to CD-R2, CD-R2 to CD-R3, and so on), then set up a true A/B test, and you will hear a degradation. The EFM encoding itself is lossless, but each read pass introduces small errors, and any burst of errors that exceeds what CIRC can correct is not restored - the drive conceals it by interpolating between neighboring samples. On the first pass these errors are insignificant, but over the course of successive copies enough of them accumulate that the sound quality is degraded.
If CD Audio data were truly an AIFF or WAV file that a computer could treat like the data on your hard drive - verified and re-read until correct - every copy would be exact, but that's not the case.
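The accumulation argument can be sketched as a toy simulation. The block size, correction capacity, and error rate below are made-up numbers for illustration, nothing like real CIRC parameters:

```python
import random

# Toy model: each copy generation introduces random byte errors on read;
# a per-block code can fix a block with at most T bad bytes, and anything
# worse is "concealed" -- the wrong bytes simply persist into the copy.
# BLOCK, T, and ERR_RATE are made-up numbers, not real CIRC figures.
BLOCK, T, ERR_RATE = 32, 2, 0.03
random.seed(1)

def copy_once(source: bytearray) -> bytearray:
    out = bytearray(source)
    for i in range(len(out)):                  # random read/write errors
        if random.random() < ERR_RATE:
            out[i] ^= 0xFF
    for b in range(0, len(out), BLOCK):        # per-block correction
        # (we cheat by comparing against the source disc; a real code
        # detects the bad bytes via its parity symbols instead)
        bad = [i for i in range(b, min(b + BLOCK, len(out)))
               if out[i] != source[i]]
        if len(bad) <= T:                      # correctable: restore fully
            for i in bad:
                out[i] = source[i]
    return out

master = bytearray(range(256)) * 8             # 2048 bytes of "audio"
gen, errors = master, []
for _ in range(6):                             # six rip-and-burn passes
    gen = copy_once(gen)
    errors.append(sum(a != b for a, b in zip(gen, master)))
```

Each generation, most blocks are corrected perfectly, but the few that exceed the correction capacity carry their damage forward, so `errors` tends to grow with every pass.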
-------
m!