Can an amplifier clip with no (or little) distortion?

Dazed_and_confused

Audioholic Intern
Trying to understand just what causes an amplifier to clip. I have a Sony STR-DN1050 amplifier. My Pioneer Andrew Jones FS-52 speakers are somewhat inefficient at a stated sensitivity of 87dB with 6 ohm impedance. (They do sound great, however!) Sound & Vision specs this amp at 86 watts with 1% distortion driving 5 channels into 8 ohms. From the graph, it appears the amp reaches a maximum distortion of 2%. Is this enough to damage my speakers if I turn the volume up near max? I was under the impression that amplifier distortion was the cause of speaker damage. Feel free to point out what I am misunderstanding...

http://www.soundandvision.com/content/sony-str-dn1050-av-receiver-test-bench
 
markw

Audioholic Overlord
Oh, it can probably make more than 2% distortion if you really push it. But the good news is that you can hear it before any damage occurs.

As has been said so many, many times before, if it starts to sound bad, turn it down immediately and you should be fine. Of course, if you persist in doing that, well...
 
Seth=L

Audioholic Overlord
The maximum 2% is probably a threshold of what the magazine feels is relevant to test. As Mark put it, the receiver can far exceed that number, but you wouldn't want to do that. Based on the bench performance that is shown there the receiver should have no problem with those speakers unless you're literally trying to knock down the house.
 
Dazed_and_confused

Audioholic Intern
Thanks for the replies. I'm going out on a limb here, but tell me if I'm thinking correctly... Let's say I'm listening to an audio CD (just good ol' 2 channel stereo). My floor speakers are rated at 87dB sensitivity. These speakers are, however, rated (conservatively) as 6 ohm speakers, so I adjust the sensitivity figure down to 85.75dB. My speakers are rated at a maximum output of 130 watts. This corresponds to 106.8dB. My amp is rated to deliver 129.9 watts at 1% distortion in 2 channel mode. I have a Galaxy CM-140 SPL meter. I typically listen at a distance of 2 meters from the speakers... Factoring in a 6dB drop, as long as my SPL meter doesn't jump above 100dB, should I be safe from damage to my amp and speakers?

I realize 100dB is very loud and capable of causing permanent hearing loss and that no more than 15 minutes at that level is safe during a 24 hour period. I'm more concerned with transient peaks (i.e. classical music) and being reasonably sure my amp and speakers can handle it...
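The arithmetic above can be sanity-checked in a few lines. This is a rough sketch only; the function name and the simple dB-summing model are my own, and it uses the worst-case free-field 6dB drop rather than a typical in-room figure:

```python
import math

def spl_at_distance(sensitivity_db, power_w, distance_loss_db=6.0, num_speakers=2):
    """Estimate peak SPL at the listening position from a stereo pair."""
    per_speaker = sensitivity_db + 10 * math.log10(power_w)  # SPL at 1 m, one speaker
    stereo_gain = 10 * math.log10(num_speakers)              # coherent pair adds ~3 dB
    return per_speaker + stereo_gain - distance_loss_db

# The numbers from the post: 85.75 dB derated sensitivity, 130 W, 6 dB drop at 2 m
print(round(spl_at_distance(85.75, 130), 1))  # about 104 dB SPL
```

On these assumptions, a stereo pair at full rated power would hit roughly 104dB at the seat, so keeping the meter under 100dB leaves a few dB of margin.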
 
GIEGAR

Full Audioholic
Dazed, you generally appear to have it by the short 'n' curlies, except for the Inverse Square Law's 6dB SPL of attenuation per doubling of distance. The 6dB figure applies outdoors and in anechoic rooms. In typical living rooms, SPL attenuates at a rate equivalent to 3dB to 4dB net of room reinforcement, so at a 2m listening distance you're looking at 4dB of attenuation at most.

Playing a coherent signal, a stereo pair of speakers will need to produce 97dBSPL each for your desired 100dBSPL peaks at the listening position. (Doubling of sources yields a 3dB gain.) Therefore each speaker will need to be producing 101dBSPL (97 + 4) peaks at 1m from the baffle. At a conservative 85dB/1W/1m sensitivity, each AJ Pioneer will need 16dBW (dB reference 1W) of gain from the amp or a 40W burst from each channel. (Using this tool.)

According to the measurements at your S&V link, this is well within the linear range* of the Sony's amps with two channels driven, so there's no problem there.

A 40W burst is also well below the AJ Pioneer's maximum power handling figure, so there should be no real issues there either. I would suspect, however, that a degree of power compression may set in, where the acoustic output does not increase at the theoretical rate determined from the input power calculations.

A kicker to be aware of with amp selection, though, is that amps' specified continuous ("RMS") power ratings are measured with sine waves, which have a 3dB crest factor (average to peak). So we can legitimately reduce the peak amp power by 3dB (i.e., halve it) to determine the minimum equivalent RMS rating. In the above "100dBSPL peak" scenario, that indicates 20WRMS per channel. (Alternately, you could consider that the 3dB of headroom is already "built in" to the calculation.)

* Below the knee on the graph at approx. 65W.
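GIEGAR's working (100dB peaks, +3dB stereo gain, ~4dB in-room loss, 85dB/1W/1m sensitivity) can be sketched as a short calculation. The function name and structure are mine; the numbers follow the post:

```python
import math

def watts_needed(target_spl_db, sensitivity_db, room_loss_db=4.0, num_speakers=2):
    """Per-channel watts needed for a target peak SPL at the listening position."""
    per_speaker_spl = target_spl_db - 10 * math.log10(num_speakers)  # each of a coherent pair
    spl_at_1m = per_speaker_spl + room_loss_db                       # back out the room attenuation
    gain_dbw = spl_at_1m - sensitivity_db                            # dB above the 1 W reference
    return 10 ** (gain_dbw / 10)

# 100 dB SPL peaks, 85 dB/1W/1m sensitivity, ~4 dB in-room loss
print(round(watts_needed(100, 85)))  # about 40 W per channel
```

The ~40W result matches the 16dBW figure in the post; halving it per the crest-factor note gives the ~20WRMS minimum.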
 
mtrycrafts

Seriously, I have no life.
... My floor speakers are rated at 87dB sensitivity. These speakers are, however, rated (conservatively) as 6 ohm speakers. I adjust the sensitivity down to 85.75.
Small correction about the terms used (or a large one ;)): sensitivity is fixed, not adjustable. It is measured at 1m with either 2.83V at the speaker terminals for 8 Ohm speakers, or 2V for 4 Ohm speakers, and the resulting SPL is recorded. Some use the same 2.83V for both, but then the reader needs to understand this, because the power applied to the speaker is not equal in the two cases.

My speakers are rated at a maximum output of 130 watts. This corresponds to a 106.8 dB.
Well, it is not the output that is 130 watts; output is measured in dB. That figure is the speaker's capacity to handle that amount of power without getting into trouble. But keep in mind that each driver in a cabinet has a much different power handling capability. Don't expect the tweeter to handle 130 watts; far from it. It's lucky if it handles 10 watts before frying.
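The point about 2.83V vs 2V drive levels comes straight from Ohm's law. A quick illustration (the loop and output formatting are mine):

```python
# P = V^2 / R for a resistive load: the standard 2.83 V drive level delivers
# 1 W into 8 ohms but 2 W into 4 ohms, which is why some measurements use
# 2 V for 4-ohm speakers to keep the applied power at 1 W.
for volts, ohms in [(2.83, 8), (2.83, 4), (2.0, 4)]:
    print(f"{volts} V into {ohms} ohms = {volts ** 2 / ohms:.2f} W")
```

So a 4-ohm speaker spec'd with 2.83V gets twice the power of an 8-ohm speaker at the same voltage, inflating its apparent sensitivity by 3dB.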
 
slipperybidness

Audioholic Warlord
To answer the original question

Can an amplifier clip with no (or little) distortion?

Nope. Clipping IS distortion.
 
highfigh

Seriously, I have no life.
Also, 1% THD means the distortion components will be -40dB WRT the amplified signal (THD is a voltage ratio), which means that unless specific conditions are met, that amount of distortion won't be audible. We aren't as sensitive to a lot of the distortion we hear as we would like to believe. Intermodulation distortion is usually more noticeable, because of the sum and difference frequencies that result. It's most noticeable when the music doesn't have a lot of instrumentation and notes being played simultaneously.
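For reference, converting a THD percentage into dB depends on whether you treat the figure as a voltage or a power ratio; the two conventions differ by a factor of two in dB. A small sketch (the function name and keyword are mine):

```python
import math

def thd_to_db(thd_percent, convention="voltage"):
    """Express a THD percentage in dB relative to the fundamental.

    THD is normally quoted as a voltage (amplitude) ratio, so 1% is -40 dB;
    read as a power ratio instead, 1% works out to -20 dB.
    """
    factor = 20 if convention == "voltage" else 10
    return factor * math.log10(thd_percent / 100)

print(thd_to_db(1))           # voltage convention: -40 dB
print(thd_to_db(1, "power"))  # power convention: -20 dB
```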
 
PENG

Audioholic Slumlord
To answer the original question

Can an amplifier clip with no (or little) distortion?

Nope. Clipping IS distortion.
I agree: clipping will result in distortion, though distortion does not always involve clipping. But that's not the OP's question. Of course I know you know that; this is just for the OP in case he doesn't know. :D
 
Swerd

Audioholic Warlord
The OP's question is actually tough to answer, if you re-word it a bit:

How much amplifier clipping results in audible noise or distortion through the speakers?

I once saw a demonstration of clipping in audio playback, where a good effort was made to measure power output and to estimate the percent clipping. The result was that no one heard clipping at low percentages, such as 1-5%. When I could clearly hear it, it was much greater than 10%. And, of course, the ability to notice clipping varied widely with the music selection.
 
mtrycrafts

Seriously, I have no life.
The OP's question is actually tough to answer, if you re-word it a bit:

How much amplifier clipping results in audible noise or distortion through the speakers?

I once saw a demonstration of clipping in audio playback, where a good effort was made to measure power output and to estimate the percent clipping. The result was that no one heard clipping at low percentages, such as 1-5%. When I could clearly hear it, it was much greater than 10%. And, of course, the ability to notice clipping varied widely with the music selection.
Yes, the more complex the music (unlike a single instrument such as a flute), the harder it is to hear distortion, especially in the low frequency spectrum.
 
3db

Audioholic Slumlord
Yes, the more complex the music (unlike a single instrument such as a flute), the harder it is to hear distortion, especially in the low frequency spectrum.
I would think it's easier to pick out distortion from classical music and that ilk than it would be from screaming, distortion-ridden electric geetars :p
 
mtrycrafts

Seriously, I have no life.
I would think it's easier to pick out distortion from classical music and that ilk than it would be from screaming, distortion-ridden electric geetars :p
Well, if those are the two choices, you could be on the money. :D
 
highfigh

Seriously, I have no life.
I agree: clipping will result in distortion, though distortion does not always involve clipping. But that's not the OP's question. Of course I know you know that; this is just for the OP in case he doesn't know. :D
Clipping IS a distortion of the waveform. Whether it's the "chicken or the egg" depends on the amp design, although the clipping could be caused by a power supply that is faulty or under-designed, even if the input voltage isn't high enough to overdrive that section under normal circumstances.
 
highfigh

Seriously, I have no life.
I would think it's easier to pick out distortion from classical music and that ilk than it would be from screaming, distortion-ridden electric geetars :p
Depends- if the guitars are loud but clean, the distortion caused by an amplifier used for reproduction won't make it sound like a guitar that's played through a distorting amp. Big difference- in the case of a tube guitar amp, the distortion can be in the preamp, tone stack, phase inverter or the output and some have a master volume control, specifically to allow different kinds of distortion in different stages. In the case of a solid state guitar amp, nobody usually wants to hear that, but a few will distort without making everyone want to fill their ears with molten lava, just to make it stop. Most sound really bad, though. It's the reason tube guitar amps came back as strongly as they did after the US-based tube manufacturers ceased production and a large supply of high quality, reliable tubes dried up. They couldn't get enough tubes so they tried using transistors, with varying levels of success (mostly bad).
 
PENG

Audioholic Slumlord
Clipping IS a distortion of the waveform. Whether it's the "chicken or the egg" depends on the amp design, although the clipping could be caused by a power supply that is faulty or under-designed, even if the input voltage isn't high enough to overdrive that section under normal circumstances.
Clipping is a form of distortion of the waveform that typically happens when an amplifier is overdriven. Distortion does not automatically imply clipping, so it is not really a case of chicken and egg. The OP's question has been answered regardless, so we can leave it at that. :D
 
