Do CD-Rs sound better than the original CD?

Rob Babcock

Moderator
BTW, I haven't had very good luck with the Melody discs. I've had a few that wouldn't work right in my burners and they seem to die on me after a few months (the terrible truth is that many cheapo discs will see the dye fade & become unretrievable in a very short amount of time, but that's a topic for another thread... ;) ).

I don't even wanna start in on the topic of black CD-Rs vs silver vs gold ones! There might be a difference, but that's a can of worms for another thread, too.

One possible theory I've heard is that the ultra HF stream from a CD-R contains less "hash" than one from a pressed disc. I'll have to find a link to the article I read asserting this- I don't recall the specifics and it's beyond my level of understanding of digital principles.

Lastly, I know that a lot of audiophools who burn at the slowest speed are shooting themselves in the foot. A newer drive is designed to burn at higher speeds and will record more accurately towards the top of its speed range than at the bottom. It's like trying to pull away from a stoplight in 5th gear while towing your boat- you're not in the powerband.

This has been a pretty interesting discussion. Wish I'd have been here the last few months to participate! :rolleyes:
 
av_phile

Senior Audioholic
They say the ears are the best judge. I often advise the same thing, if only to get away from any technical entanglements. But while the ears can judge for themselves, they are also the most fallible measuring device. I really would like to read some technical explanation of why, or in what situations, a copy should sound better. CD-R duplication is a chemical process; CD duplication via stampers is a mechanical process. But neither process has any inherent superiority over the other, does it? Do you have a link to that technical paper on why CD-R sounds better?
 
Unregistered

Guest
Burning a cdr and making a cd are not the same process at all.

We are dealing with very small pit sizes and in the mechanical process I believe that the J-Block principle is at work. Jo Blocks are gauge blocks used in high precision tooling. They are so finely finished that when one is stacked on top of another it cannot be lifted off because there is almost no air between the blocks; the top block must be slid off the lower block.

Because the metal of the CD must be removed from the master plate, the edges of the pits cannot be exactly perpendicular to the surface of the metal. Therefore the pit is not shaped the way it is usually depicted in descriptions of how a CD is constructed; it actually has a slope. This gradient interacts with the laser and can result in the laser null occurring at the wrong time (jitter).

Because the CD-R is moving as the pit is created, its edges may also have a gradient, but they will still be much sharper than on a pressed CD, resulting in less jitter.

At least that's my interpretation of why CD-Rs sound better than CDs.

Comments appreciated
 
Rob Babcock

Moderator
That can't be it- in tests there was no difference at all in measurable jitter, yet the CD-R was still perceived to sound better. It seems that if there is truly an audible difference (which I can neither confirm nor deny right now), there must be some other factor at work.

But you're right that a disc pressed from a glass master and a burned disc are two physically different animals. I'm just curious as to which differences may contribute to a difference in sound.
 
Unregistered

Guest
Jitter does not exist on the cd itself. Jitter is a timing error that occurs on playback or at the interface between the cd player and whatever it is connected to. Even when it does occur, and is not successfully eliminated by buffering and re-clocking, it is so small that it is inconsequential.

Jitter is always given as the 'reason' why different CDs, players, etc. sound poor or different from one another, but it is NOT the culprit.
 
Rob Babcock

Moderator
I disagree that jitter is inconsequential, but it's not the X factor in the CD vs. CD-R comparison.
 
DJ Oxygen

Guest
CD v. CD-R, Jitter, Error correction, and more

A few concepts, misconceptions, opinions and such...

1) Jitter. Assuming a properly built transport and D/A converter, jitter is a non-issue in playback. Theoretically it would be possible to build a CD player with separate non-synchronized clocks for the read-head electronics and the D/A converter, but as this (incorrect) scheme would be more expensive and complex, it is fairly safe to assume a single internal master clock.

Digital clock jitter is only an issue when 2 digital devices are in communication - one as the source and the other as a decoder - and are not following a common clock source. Most devices (i.e. receivers, pre-pros) are smart enough to detect the clock signal at the active digital input and lock to it.

It is preferable (and the configuration in most professional studios) to have a separate master clock device and all other devices slaved to it, but consumer gear doesn't generally offer us any such option for external master clock.

2) Ripping to CD-R. Since ripping a CD is almost just extracting the raw data (more on the "almost" in a bit), jitter is also not an issue. Same with writing back out. Computer clocks (in the GHz range) run orders of magnitude faster than audio gear, so the timing resolution is far beyond sufficient to handle 44.1 kHz, 48 kHz, or even 192 kHz with errors far smaller than our ears or our audio gear could detect.

2.5) Here's the kicker: Most people are under the misconception that data on a CD somewhat closely resembles an AIFF or WAV audio file. This is absolutely not the case. Most people are also under the impression that a pit on a CD is equivalent to either a "1" or a "0". Also not the case.

2.5.1) Words. Every 16-bit sample encoded in a WAV or AIFF file (or produced by an A/D converter) is broken into two 8-bit "words" before being written to a CD-R or encoded onto a master for stamping in a production line.
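
As an aside, that split is easy to demonstrate. Here's a rough Python sketch - purely illustrative, not the actual Red Book encoding chain - showing a 16-bit sample being broken into two bytes and reassembled with nothing lost:

import struct

# Illustrative only: split one signed 16-bit PCM sample into the two 8-bit
# bytes the encoding chain handles separately, then put it back together.
sample = -12345                          # an arbitrary example sample value
hi, lo = struct.pack(">h", sample)       # big-endian: high byte, low byte
print(f"sample {sample} -> bytes 0x{hi:02X} 0x{lo:02X}")

restored, = struct.unpack(">h", bytes([hi, lo]))
assert restored == sample                # the split loses nothing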

2.5.2) Pits and Lands. 20 years ago the available optical technology wasn't able to accurately distinguish the transitions in reflected laser intensity that a word such as "10101010" would produce. In fact, the pits would have had to be roughly twice as large as they are. Some incredible minds at Sony and Philips came up with a scheme to encode every 8-bit word as 12 or 13 bits (my old brain not remembering which) and disallowing 2 changes between a "Pit" and a "Land".

Example: "Pit, Pit, Land, Land" is OK, "Pit, Land, Pit, Pit" is not.

Apparently that wasn't quite good enough. In order to lessen the number of transitions even further, instead of reading the pits and lands themselves as "0"s and "1"s, it is the transition between them that is a "1" and the lack of transition that is a "0".

Example: "Land, Land, Land, Pit, Pit" would equal "0010"

Combining these 2 schemes in the proper order and with the proper algorithms makes the data on CDs.
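
To make that transition rule concrete, here's a tiny Python sketch (illustrative only - it implements the rule just described, not the real EFM code tables):

def decode_transitions(surface):
    # surface is a list like ["Land", "Land", "Land", "Pit", "Pit"];
    # a change between neighbours reads as a 1, no change reads as a 0.
    return "".join("1" if prev != cur else "0"
                   for prev, cur in zip(surface, surface[1:]))

print(decode_transitions(["Land", "Land", "Land", "Pit", "Pit"]))  # -> "0010"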

2.5.3) Error correction. There is a simple CRC correction scheme built into the 13 bits on the CD, so it can usually be reconstructed even if the CD is slightly damaged.

2.6) There and Back Again. Unless you like filling landfills with plastic, take my word on this. If you turn off all subjective enhancement tools in your software/hardware like "Dynamic Range Correction" or "Level Enhancement" or whatever gimmicky names Software Publisher X gave them, make 6 consecutive rips and burns (CD to CD-R1, CD-R1 to CD-R2, CD-R2 to CD-R3, and so on), and then set up a true A/B test, you absolutely will hear a degradation. Very small and (on the first pass) insignificant errors are introduced in the reading of the disc and in the conversion from 13 bits to 8 and from 8 to 13. Over the course of successive copies, enough errors are introduced that the CRC cannot completely restore the data, and the sound quality is degraded.

If CD Audio data were truly an AIFF or WAV file that a computer could treat as the same kind of data that's on your hard drive, all errors would be corrected during copies, but that's not the case.

-------
>:)
m!
 
Rob Babcock

Moderator
A lot of what you say is correct, DJ- but what does anything you said have to do with whether or not a CD-R sounds better than the original?

This isn't a theoretical issue at all. Sometimes people get so caught up in why something should be that they lose sight of whether or not it is. I've already explained that many mastering engineers assert there's an audible difference despite the fact that they can't understand how there could be. You can say placebo if you're a slave to measurements, or you could just smirk knowingly if you're the kind of guy who swears he can hear the diff between different colors of jacket on speaker cables. ;)
 
Unregistered

Guest
DJ Oxygen said:
A few concepts, misconceptions, opinions and such...
Very good post but you have a few misconceptions yourself...

DJ Oxygen said:
2.5) Here's the kicker: Most people are under the misconception that data on a CD somewhat closely resembles an AIFF or WAV audio file. This is absolutely not the case.
The data on a CD are not written in the same form as a WAV file, true. But they most certainly do "closely resemble a WAV file". A WAV file is nothing more than a header with info such as bit depth, number of channels, and more, followed by 16-bit linear PCM samples. The samples may be signed or unsigned (that info is in the header). The data on a CD are also 16-bit linear PCM, but they are not simply written linearly as in a WAV file (actually the channels are interleaved in a WAV file). The samples are Eight-to-Fourteen (EFM) modulated, interleaved, and a CIRC code is calculated and appended to each block. The blocks are all interleaved such that adjacent samples are not related to each other. Interleaved blocks are further CIRC coded. That is why horizontal scratches have only a minor chance of affecting playback: they damage multiple blocks from different tracks. Vertical scratches are much worse, but are usually fully corrected on playback.
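
To show how little there is to a WAV file, here's a minimal Python sketch using the standard wave module - the file name and the 1 kHz test tone are made up for the example:

import math
import struct
import wave

RATE = 44100                        # the CD sampling rate
samples = [int(32767 * math.sin(2 * math.pi * 1000 * n / RATE))
           for n in range(RATE)]    # one second of a 1 kHz sine tone

with wave.open("tone.wav", "wb") as w:
    w.setnchannels(1)               # mono, for simplicity
    w.setsampwidth(2)               # 16-bit samples, the same depth as CD audio
    w.setframerate(RATE)
    # The header written above is all the "format" there is; the rest of the
    # file is just linear PCM samples, packed little-endian.
    w.writeframes(struct.pack("<%dh" % len(samples), *samples))
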
DJ Oxygen said:
Some incredible minds at Sony and Philips came up with a scheme to encode every 8 bit word as 12 or 13 bits (my old brain not remembering which) and disallowing 2 changes between a "Pit" and a "Land".
It's called Eight-to-Fourteen modulation. Every 8 bits becomes 14, with a CIRC code appended. It may seem counterintuitive, but it actually increases the information density.

DJ Oxygen said:
... make 6 consecutive rips and burns (CD to CD-R1, CD-R1 to CD-R2, CD-R2 to CD-R3, and so on), and then set up a true A/B test, you absolutely will hear a degradation. Very small and (on the first pass) insignificant errors are introduced in the reading of the disc and the conversion from 13 bits to 8 and from 8 to 13. Over the course of successive copies, enough errors are introduced that the CRC cannot completely restore the data, and the sound quality is degraded.
Totally false, unless of course your drive is very inaccurate to begin with. Ripping is difficult because the block address can only get you to within 1/75 of a second of the actual block (588 samples). That is why programs like EAC read multiple times and compare the results, assuming the rips that match, say, 8 out of 16 tries are the 'correct' data. Modern CD drives now do this themselves with the help of a well-designed driver. Yamaha and Plextor drives are particularly noteworthy for being extremely accurate.
The EFM conversion does not lose information at any time.
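
For the curious, the read-and-compare idea is easy to sketch in Python. This is a toy illustration of the principle, not EAC's actual algorithm; 'read_block' is a stand-in for whatever the drive and driver expose:

from collections import Counter

def secure_read(read_block, attempts=16, required=8):
    # Re-read the same block several times and keep whichever result the
    # majority of passes agree on; give up if agreement is never reached.
    reads = [bytes(read_block()) for _ in range(attempts)]
    data, count = Counter(reads).most_common(1)[0]
    if count < required:
        raise IOError("reads never agreed - block is probably damaged")
    return data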

DJ Oxygen said:
If CD Audio data was truly an AIFF or WAV file that a computer could treat as the same kind of data that's on your hard drive all errors would be corrected during copies, but that's not the case.
Regardless of whether the data were in the same format as a wav file, error correction only occurs during playback, except for those drives that attempt to do the correction themselves. If the drive and/or extraction program you use can do a reasonably good job of correcting any errors during extraction, then the resulting data saved as a WAV file is identical to the cd. No amount of re-ripping the copy will change that unless the drive is inconsistent and gets worse with every rip.
 
djoxygen

Full Audioholic
In theory...

Rob Babcock said:
A lot of what you say is correct, DJ- but what does anything you said have to do with wether or not a CD-R sounds better than the original?

This isn't a theoretical issue at all. Sometimes people get so caught up in the issue of why something should be instead of whether or not something is. I've already explained that many mastering engineers assert there's an audible difference despite the fact that they can't understand how it could be. You can say placebo if you're a slave to measurements, or you could just smirk knowingly if you're the kind of guy who swears he can hear the diff between different colors of jacket on speaker cables. ;)
It has everything to do with it. "Theoretically", a CD-R can only sound exactly as good as the original. And in the real-world experience of my own (mastering engineer) ears, the theory is better than the reality.
 
djoxygen

Full Audioholic
Misconceptions...

Unregistered said:
The data on a cd are not written in the same form as a WAV file, true. But they most certainly do "closely resemble a WAV file". ...
Its called Eight-to-Fourteen modulation. Every 8 bits becomes 14 + circ code appended. It may seem counterintuitive but it actually increases the information density.
Thanks for clarifying and expanding on my not-quite-complete knowledge of the process, and also for reinforcing my claim that the pattern of bits on a CD bears little resemblance to the corresponding pattern of bits representing the same audio on a hard drive. More below...

Unregistered said:
Totally false, unless of course your drive is very inaccurate to begin with. Ripping is difficult because the block address can only get you to within 1/75 second of the actual block (588 samples). That is why programs like EAC read multiple times and compare the results, assuming the rips that match, say 8 out of 16 tries, is the 'correct' data. Modern cd drives now do this themselves with the help of a well designed driver. Yamaha and Plextor drives are particularly noteworthy for being extremely accurate.
The EFM conversion does not lose information at any time.
I was not trying to claim that the EFM process itself loses information, but that (as you explain) the hardware/firmware drivers and the extraction software must be perfect in order to reconstruct an audio file that exactly matches what was originally EFM'd to create the master. It is possible that both of those criteria are not met and the file may not be perfectly reconstructed. (Incidentally, I have always sworn by Yamaha drives for the highest quality ripping and mastering.)

Unregistered said:
Regardless of whether the data were in the same format as a wav file, error correction only occurs during playback, except for those drives that attempt to do the correction themselves. If the drive and/or extraction program you use can do a reasonably good job of correcting any errors during extraction, then the resulting data saved as a WAV file is identical to the cd. No amount of re-ripping the copy will change that unless the drive is inconsistent and gets worse with every rip.
Essentially this was what I was trying to communicate. With a perfect driver and extraction software, one could make a perfect copy, but certainly no better, and in a greater-than-zero number of real-world cases, the copy will be inaccurate in a way that degrades the sound.

I believe it is theoretically possible that some combination of CD-R dye-color/reflectivity and CD drive/player hardware may return a more easily de-EFM'd and de-CRC'd data stream, which in turn *may* sound better than the original. However, unless those making the claim that every CD-R they rip/burn is better than the corresponding original have set up a double-blind A/B test where they can correctly identify the CD-R as better than the original more than 50% of the time, I will remain a skeptic.
 
Rob Babcock

Moderator
So I gather you are saying "no", a CD-R can't sound better than the original then? Geez, a simple yes or no would do.

I'm not saying it can or can't sound better or different, but I will say that when preconceived notions or theories run smack up against experience, the truth will prevail. You claim to be an engineer- so be it. Then you must certainly realize there are other mastering engineers who do believe CD-Rs sound better.

I'll say only this, then leave you to your own opinion: there's a universe of things we don't know about sound & digital artifacts. I don't claim any secret knowledge, but a lot of what we think we know often turns out not to be so. Theory is just that- theory. Reality is irrefutable. I'd like to see a double-blind test to see if there really is a diff. I believe if you can hear a difference blind, then it's back to the drawing board for a better theory. IMHO, if you can't tell a difference blind, then the difference doesn't exist or is inaudible (at least to that listener).
 
djoxygen

Full Audioholic
Rob Babcock said:
Geez, a simple yes or no would do.
I tend toward wordiness, and I apologize.

Rob Babcock said:
Then you must certainly realize there are other mastering engineers who do believe CD-Rs sound better.
Actually, before joining this discussion, I hadn't heard anyone make that claim, and I thought it was an interesting one.

Rob Babcock said:
there's a universe of things we don't know about sound & digital artifacts... Theory is just that- theory. Reality is irrefutable. I'd like to see a 2Xblind test to see if there really is a diff. I believe if you can hear a difference blind, then it's back to the drawing board for a better theory. IMOHO, if you can't tell a difference blind, then the difference doesn't exist or is inaudible (at least to that listener).
On all these points we are agreed. I have seen so much evidence over the years that the brain is more of an influence on hearing than any other mechanical or organic part of the process. If anyone prefers the sound of a CD-R to that of a manufactured CD, then they should immediately copy every CD they get.

My final statement on this topic: claiming that a personal preference equals "better" is a dangerous thing for any of us to do.
 
Unregistered

Guest
djoxygen,
Nothing wrong with wordiness; this stuff is complicated and sometimes we have to write a lot to cover everything. You were pretty much spot-on with your post, just a few things I wanted to clarify - and the thing about jitter - you have that exactly right. I wish we could get the word jitter out of everyone's minds. It just doesn't matter.

I do agree that after multiple rips quality can go downhill, just that, as we seem to agree, a decent combination of drive/firmware and extraction software can go a long way towards minimizing that.

..and so we are totally seeing eye to eye on this...
AFTER the data are ripped from the cd by following the long winding road of interleaved data blocks and checking/correcting circ errors, the result is a WAV file. :)
 
Rob Babcock

Moderator
I was referring to an article of John Atkinson's in Stereophool, er, Stereophile. He mentions demonstrating the superiority of CD-R to an astonished mastering engineer friend who just kept muttering "I can't believe it." JA said he hears the diff and while he can't explain it, he proposes some possible reasons (none of which I could comprehend- I sorta understood it while I was reading it, but an hour later that faint glimmer of understanding disappeared like smoke).

Some guys do copy all their discs for this reason, and of course on black CD-Rs. For some reason some claim they sound better than silver ones. I know, I know- about now BS alarms are going off all over this site, but I'm not inventing this stuff, just reporting it! ;)

For the record, I'm agnostic on the topic. I can't tell a diff, but I haven't ever done a controlled, level-matched A-B comparison, nor is my humble audio system and/or ear-brain combo sufficiently resolving. Obviously, many "differences" exist only in the mind; the only way to tell the real from the illusory is to rigorously test them. I'd love to participate in such a test, but I imagine it could lead to some bruised egos. How many Golden-Eared Audiophools would be willing to publicly prove they can reliably tell amps or cables apart? I'm guessing the list would be short & the excuses legion! :rolleyes:
 
Rob Babcock

Moderator
DJO, you make another excellent point- a difference or preference doesn't necessarily equate to something being better. Many beloved pieces of gear add tons of euphonic colorations to the sound, adding "air", "bloom", "action", etc. But the things they add may not necessarily be in the recording.

If two pieces of cable, one costing $5 and the other costing $2000, are shown to actually sound different enough to detect, why is it automatically assumed that the $2000 one is the better of the two?

Dan Banquer has posted a link to the Cable Shootout II over at my main haunt, AudioCircle. I'm afraid his scientific & objective views of cable will get him nowhere over there. Most of those guys believe every cord, cable & cap contributes its own unmistakable sound. Me, I'm Cable Agnostic, tending towards scepticism. I can't afford $500 ICs, but I split the diff & hedge my bets by going a bit over Monster Cable. I use mostly Zu Cables and some I've made. Zu might make some really spendy cables, but they have some cheaper ones, too. They're well made & they look really sweet. That's gotta help the sound, right? :D
 
Unregistered

Guest
The few articles I've seen on the topic with regard to mastering engineers were exactly the opposite. They recorded directly to digital tape and transferred it to the hard drive of a DAW. They then burned the image to a cd-r, re-ripped it and compared the bits to those on the hard drive. They were identical. Nonetheless, they believed they heard a difference and that the cd-r version was inferior to the original.
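
That bit-comparison step is easy to reproduce at home: hash both captures and see whether the digests match. A Python sketch, with made-up file names:

import hashlib

def sha256_of(path):
    # Hash the file in chunks so even a full CD image fits in memory easily.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Identical digests mean the two files are bit-for-bit identical.
print(sha256_of("original_capture.wav") == sha256_of("reripped_from_cdr.wav"))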

Naturally they could not make any sense of this. How could identical bits sound different on playback? Maybe it was their preconceived notion that it would, and they really wanted it to sound worse. My assumption is simply that it was the playback chain. Just because you have two files on a disc that are identical doesn't mean they will be read off the disc in the same order with perfect timing. Disks get fragmented, and other processes can momentarily interrupt data transfer, which might introduce a barely perceptible change to trained ears.

The same scenario could apply when playing the CD-R. The only way I think a CD-R COULD actually sound better than an original pressing is if the original media were very poor quality to begin with, nearly all errors were corrected during extraction, and the result were burned to a CD-R blank with a lower BLER (block error rate). Then the error correction system wouldn't have to work as hard on playback.
 
ruadmaa

Banned
Do CDR's Sound Different

When digital audio first came out it was highly touted that all copies made would sound EXACTLY the same as the original master. In analog copying every subsequent copy loses quality. When vinyl was king, it was pretty much standard practice that by the time an original tape was copied down to vinyl for pressing it was 5th generation analog with much quality loss from the original.

So in answer to the original question, NO, every digital copy should be exactly the same as the original regardless of the color of blank disc you are copying onto.
 
Unregistered

Guest
Naturally any copy of a digital file will be identical and thus will sound the same, unless something in the playback chain influences the sound as I tried to describe above.

Ripping from a CD and then burning to a cd-r SHOULD be identical and thus sound the same, but as we've been discussing there are a few potential issues that may arise during the extraction process.

And you are definitely right about the 'color' of a CD (as opposed to the dye formulation). The roughly 780 nm infrared laser used by a CD player won't be affected by any color under the rainbow. I laugh at the 'green marker' tweaks that would proclaim otherwise.
 
lanecoveking

Guest
My experience with CD-R

I have tried copying some very good CDs to ordinary silver CD-R, Melody Black Diamond, Verbatim Vinyl CD-R (black), and Verbatim Vinyl CD-R with the outer and inner edges blackened. I compared them using mainly smooth jazz like Diana Krall and Eddie Higgins; with Diana Krall especially I listened closely to the vocal and the piano for differences. It seems to me that where the soundstage of the original CD may be flatter, when it is copied to CD-R the vocal stands out with better resolution and less glare, especially with the Melody Black Diamond. But the Melody Black Diamond seems to slightly colour the recordings. Being unable to get more Melody Black Diamonds (in Australia), I tried the Verbatim Vinyl CD-R (black with a blue data surface); it is slightly better than ordinary silver CD-R. Then I tried blackening the outer and inner edges & circle of them, and the vocal seems to stand out even better, with better resolution. I blind tested it and I could tell the difference. I think blackening the outer and inner edges and circle does help a lot.

This may sound crazy. It is up to you to believe it or not.
 