PENG said:
I don't disagree with you for the most part, but as fuzz mentioned, it is also possible that an external amp could provide "cleaner" power at low-level output of, say, 1-2W.
Cleaner power, as in lower distortion for a given output? Why would the Anthem not deliver clean enough power at 1-2W? Distortion is a function of output power: at 1-2W, a tiny fraction of what any modern amp can deliver, the distortion would be microscopic.
Here are the Anthem MRX300 power output results from the test bench:
Five channels driven continuously into 8-ohm loads:
0.1% distortion at 71.4 watts
1% distortion at 83.4 watts
With two channels driven into 8 ohms, it can deliver 93 watts at 0.1% distortion.
If you reduce the power from 93 watts to 50 watts into those two channels, you can add a few leading zeros to the distortion figure.
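To put those percentages on a more intuitive scale, here's a quick back-of-the-envelope conversion to dB below the fundamental (just a Python sketch of the standard 20·log10 relationship; the figures plugged in are the bench numbers above):

```python
import math

def thd_to_db(thd_percent):
    """Convert a THD percentage to dB relative to the fundamental."""
    return 20 * math.log10(thd_percent / 100)

print(thd_to_db(1.0))    # 1% THD     -> -40 dB
print(thd_to_db(0.1))    # 0.1% THD   -> -60 dB
print(thd_to_db(0.001))  # 0.001% THD -> -100 dB
```

Every leading zero you add is another 20 dB shaved off the distortion products, pushing them further below the music.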
Hypothetically speaking, if the Anthem were clipping at 1-5% THD at 110W and the XPA-5 delivered 110W cleanly at 0.001%, then I'll certainly concede that in such a scenario the XPA-5 should sound cleaner. Clipping is usually difficult to ignore once you've heard it.
However, if the Anthem can deliver 15W at 0.0005%, hypothetically speaking, and the XPA-5 can deliver 15W at 0.00000015%, do you think anyone can hear the difference? At 1-2W you could add several more leading zeros to those figures.
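Running the same arithmetic on those two hypothetical figures (again, just a sketch of the 20·log10 conversion):

```python
import math

# The two hypothetical THD figures from above, at 15W:
for thd_percent in (0.0005, 0.00000015):
    db = 20 * math.log10(thd_percent / 100)
    print(f"{thd_percent}% THD is {db:.1f} dB below the fundamental")
```

Roughly -106 dB versus -177 dB: both are buried far below the noise floor of any real listening room; one is just buried absurdly deeper.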
Amplifiers are usually measured on a test bench at full output using sine waves, with 0.1% distortion as the reference point. You can exceed rated power and push distortion above 0.1%, or you can decrease distortion by reducing power. 0.1% is generally considered low distortion, i.e. clean power.
I don't think there are people who can pinpoint 0.1% versus 0.001% at a given output power while listening to program material, but for the sake of discussion, let's assume they can. Let's also assume the difference can be easily heard despite all the masking effects at play. If we reduce power, and the corresponding SPL, so that distortion is now 0.001%, do you think people can hear that?
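For perspective on what reducing power actually costs in level, the power-to-SPL relationship is logarithmic too (a quick sketch of the standard 10·log10 power ratio, using the 93W bench figure above):

```python
import math

def spl_change_db(p_new, p_old):
    """SPL change in dB when output power moves from p_old to p_new watts."""
    return 10 * math.log10(p_new / p_old)

print(spl_change_db(50, 93))  # ~ -2.7 dB: a small drop in level
print(spl_change_db(2, 93))   # ~ -16.7 dB
```

Dropping from 93W to 50W gives up less than 3 dB of level while tacking extra leading zeros onto the distortion figure.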
Any amplifier, even an entry-level sub-$400 Yamaha receiver, can muster at least 70W into two channels on a test bench at 0.1% THD using sine waves. At 50W, the distortion would be significantly lower. At 1-2W, it would be positively microscopic. I vehemently disagree with the idea that one can hear distortion at microscopic levels; our ears don't have unlimited sensitivity to very small measurable changes.
Beyond a certain point it really makes no difference. Back to examples: that McIntosh amplifier rated at 1.2kW would probably deliver 110W at 0.00000000000000005%, and 1-2W with another ten zeros tacked on.

Can one hear the difference? Or is it just a numbers game? I'm inclined to think it *is* a numbers game and nothing more.
Don't get me wrong, it is intuitive to think that a bigger amplifier will deliver cleaner power, and I agree, all things being equal; but if the power is clean enough at your listening levels, does it actually matter if you add another leading zero to the number? Sorry for the very long post, I sometimes get carried away.
