Guys, I wish I had more time to contribute to this thread, but I'm preparing for some travel to a few cool companies that I'll be writing articles on shortly.
I just wanted to point out a few things:
1) S&V's power tests, like most magazine power tests, are usually conducted with the line voltage held constant. In some cases they even replace the power fuse on the receiver with a higher-rated one to do the test. This is a very unrealistic test condition, but the advantage is repeatability, and of course bloated power figures (see the sketch after point 4 for the arithmetic). We don't test with the line voltage held constant, but we always monitor it to ensure the line isn't dropping below 115 Vrms during our testing.
2) Their tests are typically run with an automated script on their audio analyzer to plot power vs. distortion. While this is a useful test for showing the max limits of the amp, it is not a continuous test signal, so real-world numbers will be a bit lower.
3) Their power numbers are usually taken into clipping, at 1% THD. I really think power tests should be conducted at no more than 0.1% THD. IMO, testing into clipping is like a guy claiming he can bench 405 lbs but needing a spotter to lift the weight off the rack and keep his fingers on the bar while he does a single rep. A true test of strength is the ability to lift the weight on your own more than once; at least that's how I feel about it.
You really have to know whether the tests are conducted in the same manner each time. Usually it's poorly documented, as was the case several years ago when S&V forgot to hold the line voltage constant and mistakenly measured a Denon AVR-4802 as putting out more power than a Yamaha RX-V1. I'm not pointing fingers at S&V so much as trying to convey that you shouldn't put much weight on a power measurement that varies by 10-20 watts or so between products, even when tested by the same publication and/or reviewer.
4) The power consumption figure on the back of a receiver is usually not a max rating. It is usually a UL rating, taken with two channels at full power and all remaining channels at 1/8 of full power, as sketched below.
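Since points 1 and 4 both come down to quick arithmetic, here's a rough back-of-envelope sketch in Python. All the figures (rated power, channel count, efficiency) are made up for illustration, not measurements from any actual unit:

```python
# Back-of-envelope sketch; every number here is hypothetical.

# Point 1: line-voltage sag. Amplifier rail voltage roughly tracks the
# AC line, and output power scales with the square of voltage, so a sag
# from a held 120 Vrms down to 115 Vrms costs real watts.
rated_power_120v = 120.0            # W per channel, measured at 120 Vrms line
power_at_115v = rated_power_120v * (115.0 / 120.0) ** 2
print(f"~{power_at_115v:.0f} W at 115 Vrms vs {rated_power_120v:.0f} W at a held 120 Vrms")
# -> roughly an 8% drop from line sag alone

# Point 4: UL-style consumption label. Two channels at full rated power,
# the rest at 1/8 power, divided by an assumed amplifier efficiency.
channels, rated_w, efficiency = 7, 100.0, 0.6    # hypothetical class-AB AVR
output_w = 2 * rated_w + (channels - 2) * rated_w / 8
print(f"Label reads ~{output_w / efficiency:.0f} W; all channels at max "
      f"would need ~{channels * rated_w / efficiency:.0f} W")
```

That gap between the label and the all-channels-max figure is exactly why the back-panel number can't tell you whether the amp can drive every channel flat out.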
As I have stated many times in the past, and as others in this thread have too, "All Channels Driven" is an unrealistic, worst-case test condition into a best-case test load. It's not at all representative of what happens in real-world home theater or music. What you should pay attention to is frequency response uniformity, signal-to-noise ratio, output impedance, and distortion at various power levels and impedances. The amp should have a power supply robust enough to deliver full power into 2 or 3 channels with a continuous white-noise burst, or even a steady-state sweep, while still having enough headroom for peak power demands, similar to the IEC specification for dynamic power (a quick headroom example follows).
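To put a number on that headroom, here's a tiny sketch in the spirit of the IEC dynamic-power idea. Both wattage figures are hypothetical:

```python
import math

# Dynamic headroom: how far short-burst power exceeds the continuous
# rating, expressed in dB. Figures are illustrative only.
continuous_w = 100.0   # rated continuous power into 8 ohms
dynamic_w = 160.0      # short-burst power the supply can deliver
headroom_db = 10 * math.log10(dynamic_w / continuous_w)
print(f"Dynamic headroom: {headroom_db:.1f} dB")   # ~2.0 dB
```

A couple of dB of dynamic headroom goes a long way on movie soundtracks, where peaks are brief and the average level sits far below them.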
In closing, realize that in today's home theater realm, speakers are typically 87 dB @ 1 W/1 m or better, with relatively constant impedance profiles not dipping below 4 ohms. In addition, the typical user usually has all speakers set to small, with a 12 dB/oct high-pass slope and the bass redirected to a dedicated subwoofer or two. If the speaker's impedance dip is at low frequencies, no worries, since it will be crossed over via the receiver's bass management. If the dip is at high frequencies, it's really not much of a concern, since above 8 kHz music is mostly harmonic in nature and thus places very little power demand on the amplifier. Most of the power demand in home theater is in the bass, where high-powered subwoofers handle it.
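To see what that 87 dB sensitivity figure buys you, here's a rough SPL budget using the standard anechoic point-source approximation, SPL(d) = sensitivity + 10*log10(P) - 20*log10(d). It ignores room gain (which usually buys back a few dB), and the target level and distance are made up for illustration:

```python
import math

# Power needed to hit a given peak SPL at the listening seat.
# Anechoic approximation; real rooms typically give a few dB back.
sensitivity_db = 87.0   # dB @ 1 W / 1 m, per the typical speaker above
distance_m = 3.0        # hypothetical listening distance
target_peak_db = 96.0   # hypothetical loud peak at the seat

power_needed_w = 10 ** ((target_peak_db - sensitivity_db
                         + 20 * math.log10(distance_m)) / 10)
print(f"~{power_needed_w:.0f} W per channel for {target_peak_db:.0f} dB peaks at {distance_m:.0f} m")
```

That works out to roughly 70 W per channel, which is within reach of most decent receivers on peaks, especially with the bass already offloaded to a powered sub.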
Only in the largest installs, with very inefficient speakers and high SPL demands, do you need ultra-high-power amplifiers. In most cases a mid- to high-end receiver will have enough power to deliver home theater with aplomb.