Is digital clipping worth worrying about?


tbewick

Senior Audioholic
I bought a couple of CDs last week, Interpol's Antics and Franz Ferdinand's You Could Have It So Much Better. I think the sound quality is dreadful on both CDs, and I wondered if digital clipping might be one of the reasons why. The tracks I've heard sound slightly better on digital TV (it could be because digital TV uses a 32 kHz sample rate, so there's less high-frequency energy than with 44.1 kHz CD audio).

Out of curiosity, I digitally extracted some tracks from the two CDs using MusicMatch. To my surprise, Cool Edit's VU meter clip indicators lit up when playing tracks that showed no 'possibly clipped samples' in Cool Edit's statistical analysis. I was thinking that to remove any risk of digital clipping, I could extract the tracks and then normalise them to a sensible value (I thought 50% would be okay).

Seeing as this would be quite time-consuming, does anyone have any views on whether or not it's worthwhile? I've tried it with a couple of tracks and the VU meter clip indicator no longer lights up. Subjectively, the tracks sound better.

Thanks
 

MDS

Audioholic Spartan
Clipping is bad in general, but it often doesn't really affect what you hear unless it is severe. I've never used Cool Edit, but I've been using Sound Forge for at least 10 years now and have extracted and edited quite a bit of music, so here are my thoughts for what they're worth.

1. The clip indicators in most audio editors (I assume Cool Edit is similar) are simple 'over' counters. Basically, that means clipping is defined as N consecutive samples at maximum amplitude (0 dB). Sound Forge appears to use N = 4, because if you generate a square wave at 0 dB and select fewer than 4 samples, the clip indicators do not light; with 4 or more, they do.
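That N-consecutive-samples rule is easy to sketch. The following is hypothetical Python (not Sound Forge's actual code), assuming float samples normalised to [-1, 1] in a NumPy array:

```python
import numpy as np

def count_overs(samples, run_length=4, full_scale=1.0):
    """Count 'over' events: run_length or more consecutive samples
    pinned at full scale. run_length=4 mirrors the Sound Forge
    behaviour observed above; other editors may use other values."""
    at_max = np.abs(samples) >= full_scale
    overs, run = 0, 0
    for pinned in at_max:
        if pinned:
            run += 1
            if run == run_length:  # count each flattened run once
                overs += 1
        else:
            run = 0
    return overs

print(count_overs(np.array([0.2, 1.0, 1.0, 1.0, 0.3])))       # 0: only 3 in a row
print(count_overs(np.array([0.2, 1.0, 1.0, 1.0, 1.0, 0.3])))  # 1: 4 in a row
```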

2. 4/44,100 is about 0.00009 seconds (0.09 ms) and is entirely inaudible. I have tried to determine the shortest audible interval, and the guys on the Sound Forge forum (most of whom are audio engineers) seem to think that somewhere around 6 ms is the shortest interval over which the human ear can detect changes. Whether or not that is true (it probably depends on frequency, like so many other things), I can't say I definitely hear a 'defect' in the sound when the clip indicators light up.

3. The caveat to the above is that on tracks where the clip indicators light constantly because there are LOTS of clipped peaks close together, it sometimes does sound not quite right.

4. Normalizing (peak) simply applies a constant gain to every sample. If, for example, the highest peak is -1 dB and you choose to normalize to 0 dB, it just adds 1 dB to every sample. Likewise, if you normalize down: if the highest peak is 0 dB and you normalize to -1 dB, it subtracts 1 dB from every sample. Normalizing will not restore clipped peaks; the waveform will still be squared off, but those 4 (or however many) samples in a row will now be below 0 dB, so the clip indicator will no longer light.
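As a sketch (hypothetical code, assuming float samples in [-1, 1]): adding a fixed dB gain to every sample is the same as multiplying every sample by one constant factor, so the shape of the waveform, including any squared-off peaks, is unchanged:

```python
import numpy as np

def peak_normalize(samples, target_db=0.0):
    """Scale so the highest absolute peak lands at target_db (dBFS).
    One constant factor is applied to every sample, so clipped
    (squared-off) peaks stay squared off -- only quieter."""
    peak = np.max(np.abs(samples))
    if peak == 0.0:
        return samples
    target = 10.0 ** (target_db / 20.0)  # dBFS to linear amplitude
    return samples * (target / peak)

x = np.array([0.0, 0.5, -1.0, 1.0])    # peaks at 0 dBFS
y = peak_normalize(x, target_db=-6.0)  # -6 dB is a factor of ~0.5
print(np.max(np.abs(y)))               # ~0.501
```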

The only way to truly reduce the clipping is to use a plug-in like Sound Forge's Clipped Peak Restoration, which essentially interpolates new sample values to reconstruct the peaks below 0 dB. For small amounts of clipping, it is not worth it IMO because, as I said, the interval is too short to hear anyway.
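The interpolation idea can be illustrated crudely. This is a toy sketch, not Sound Forge's actual Clipped Peak Restoration algorithm: it first attenuates to create headroom, then refits each flattened run with a parabola through the neighbouring clean samples:

```python
import numpy as np

def restore_clipped(samples, full_scale=1.0, headroom_db=-3.0):
    """Toy clipped-peak restoration: attenuate, then replace each
    run of full-scale samples with a parabola fitted through the
    two good samples on either side of the run."""
    out = samples * 10.0 ** (headroom_db / 20.0)
    clipped = np.abs(samples) >= full_scale
    i, n = 0, len(out)
    while i < n:
        if clipped[i]:
            j = i
            while j < n and clipped[j]:
                j += 1
            if i >= 2 and j + 1 < n:  # need 2 clean samples per side
                xs = np.array([i - 2, i - 1, j, j + 1])
                coeffs = np.polyfit(xs, out[xs], 2)
                out[i:j] = np.polyval(coeffs, np.arange(i, j))
            i = j
        else:
            i += 1
    return out

# A parabolic peak sliced off at full scale...
x = np.arange(14.0)
flattened = np.clip(1.2 - 0.02 * (x - 6.0) ** 2, -1.0, 1.0)
restored = restore_clipped(flattened)
# ...comes back above the (attenuated) flattened level.
print(restored[6] > 1.0 * 10.0 ** (-3.0 / 20.0))  # True
```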

Instead of clipped peak restoration, you can apply a dynamic compressor or normalize to a specific RMS value (not peak normalization), but considering that the music is already highly compressed, the results often aren't satisfying. I tend to just leave it alone.
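For contrast with peak normalization, an RMS normalize scales to an average level rather than to the highest peak. Again a hypothetical sketch; real tools typically work on short windows and follow up with a limiter, since raising the RMS can push peaks past full scale:

```python
import numpy as np

def rms_normalize(samples, target_db=-20.0):
    """Scale so the overall RMS level hits target_db (dBFS).
    Tracks perceived loudness better than peak normalization,
    but the resulting peaks may exceed full scale."""
    rms = np.sqrt(np.mean(samples ** 2))
    if rms == 0.0:
        return samples
    return samples * (10.0 ** (target_db / 20.0) / rms)

t = np.linspace(0.0, 1.0, 44100, endpoint=False)
y = rms_normalize(0.5 * np.sin(2 * np.pi * 100 * t), target_db=-20.0)
print(np.sqrt(np.mean(y ** 2)))  # ~0.1, i.e. -20 dBFS
```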
 

tbewick

Senior Audioholic
Cheers MDS for your input.

I found out why the clip indicators were going off: DC adjustment in Cool Edit was turned on, which can make the clip indicators incorrectly report clipping. My main reason for messing around was those Audioholics articles about DACs being overloaded by highly compressed music. My thought was that the DAC in my sound card (and receiver) might be overloading on playback, so I normalised the tracks down to roughly 14-bit level (~16k sample values). The distortion I heard at 16 bits on the Interpol album is still present, so there's a good chance the distortion is the result of limiters applied to the original recording. The tracks do sound slightly softer at 14 bits, but the difference is subtle and I may just be imagining it. It could be that the tracks sound softer because of the decrease in resolution and the effects of dither or truncation artifacts/noise from the normalisation.

It's also possible that MusicMatch isn't extracting the tracks accurately, but I don't know of any other decent (free) digital CD extractors.
 

MDS

Audioholic Spartan
It's interesting that you mention DC offset adjustment. A DC offset occurs when the waveform is not centered on zero. It is really an electrical phenomenon that arises in the recording studio from slight mismatches in the equipment. Sound Forge calculates DC offset and provides a tool to remove it. I used to do that, until I learned how it calculates the offset and how it removes it.

All Sound Forge does (other editors may differ) is essentially add up all the samples and divide by the number of samples. If the result is not zero, it reports the difference as a 'possible' DC offset. If you choose to remove it, it subtracts that difference from every sample. [There is no real way to calculate an actual offset, as the SF forum informed me; they actually thought it was comical that I would worry about such things. ;)]
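That mean-and-subtract procedure is literally a two-liner. A sketch (hypothetical code, assuming float samples in a NumPy array):

```python
import numpy as np

def dc_offset(samples):
    """The 'possible DC offset' figure: just the mean of all samples."""
    return np.mean(samples)

def remove_dc(samples):
    """Subtract the mean from every sample to recentre the waveform."""
    return samples - np.mean(samples)

# A sine skewed 0.05 above the baseline:
x = np.sin(np.linspace(0.0, 20.0 * np.pi, 1000)) + 0.05
print(dc_offset(x))             # ~0.05
print(dc_offset(remove_dc(x)))  # ~0.0
```

Note how a long fade or stretch of near-silence skews the mean for exactly the reason described below: the average is taken over everything, including the quiet parts.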

The problem with relying on this method is that if the waveform has a long fade-in or fade-out, or long sections of silence (or very small sample values), the calculation is sure to show a non-zero offset. Every single wav I have ever ripped from a CD shows some non-zero offset, even though the waveform appears centered on the baseline and sounds fine to me.

The only CD that definitely appeared skewed above the baseline was an Ozzy Osbourne CD, The Ozzman Cometh. The values SF reported were huge (in the thousands, an order of magnitude greater than most CDs), and it did sound poor. Applying DC offset removal to those tracks did center the waveform but made it sound even worse. After some research, I discovered it was a known issue and the CD was re-released (I bought a new copy).

Long story short... I no longer use the DC offset removal tool. I feel it is another thing not worth worrying about, considering that on playback your sound card might add an offset anyway.

It's fascinating stuff, and Cool Edit is one of the well-regarded tools.
 

j_garcia

Audioholic Jedi
tbewick said:
I found out why the clip indicators were going off: DC adjustment in Cool Edit was turned on, which can make the clip indicators incorrectly report clipping. My main reason for messing around was those Audioholics articles about DACs being overloaded by highly compressed music. My thought was that the DAC in my sound card (and receiver) might be overloading on playback, so I normalised the tracks down to roughly 14-bit level (~16k sample values). The distortion I heard at 16 bits on the Interpol album is still present, so there's a good chance the distortion is the result of limiters applied to the original recording. The tracks do sound slightly softer at 14 bits, but the difference is subtle and I may just be imagining it. It could be that the tracks sound softer because of the decrease in resolution and the effects of dither or truncation artifacts/noise from the normalisation.
I think the issue is the recording/mastering, as you mention, not "clipping" per se. It wouldn't surprise me if that were the case, I guess is what I'm saying, so I wouldn't worry about it.
 