Say you have a speaker with a nominal impedance of 6 ohms and 88 dB sensitivity. How would that compare to one of 8 ohms impedance and 91 dB sensitivity?
My logic tells me it will potentially be just as loud, if not louder: it's an easier load to push, so the amp can unleash more power into it, giving similar dB levels despite the lower sensitivity. But it will draw more current through the amp and so put more strain on it (it will run hotter).
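That trade-off can be checked numerically. Here's a rough sketch assuming an ideal voltage-source amp rated 95 W RMS into 8 ohms (the rating is just an assumption for illustration), using SPL = sensitivity + 10*log10(power):

```python
import math

# Assumed amp rating for illustration: 95 W RMS into 8 ohms.
P_RATED, R_RATED = 95.0, 8.0
V = math.sqrt(P_RATED * R_RATED)   # amp output voltage, ~27.6 V

def spl(sensitivity_db, power_w):
    """SPL at 1 m: sensitivity (dB/W @ 1 m) plus 10*log10(power in watts)."""
    return sensitivity_db + 10 * math.log10(power_w)

# 6-ohm / 88 dB speaker: draws more power from the same voltage
P6 = V**2 / 6.0                    # ~126.7 W
# 8-ohm / 91 dB speaker: gets the rated power
P8 = V**2 / 8.0                    # 95 W

print(round(spl(88, P6), 1))       # 109.0
print(round(spl(91, P8), 1))       # 110.8
```

Under these assumptions the extra power into 6 ohms buys about 1.2 dB, which doesn't fully close a 3 dB sensitivity gap, but the numbers are in the same ballpark.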
Some very (overly?) simplistic calculations. Say we test with two pairs of speakers: one a very efficient 8-ohm load at 92 dB/W @ 1 m, the other a 6-ohm load at only 86 dB/W @ 1 m.
Let's call loudness L, sensitivity S, power P, impedance R.
If we keep the current constant (power is irrelevant, remember; current is what matters!), an amp rated at 95 W RMS into 8 ohms will be putting out:
I = sqrt(P/R) = sqrt(95 W / 8 ohm) ≈ 3.4 A
So that same amp could push a 6 ohm load with:
P = I^2 * R = 11.875 * 6 = 71.25 W ≈ 71 W (using the exact I^2 = 95/8 = 11.875 A^2)
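The constant-current comparison above in a few lines of Python (same 95 W RMS @ 8 ohm rating from the text):

```python
import math

P8, R8, R6 = 95.0, 8.0, 6.0
I = math.sqrt(P8 / R8)            # current at rated 8-ohm power, ~3.45 A
P6 = I**2 * R6                    # same current into 6 ohms, ~71.25 W

print(round(I, 2), round(P6, 2))  # 3.45 71.25
```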
On the other hand, most amps are rated at roughly 150% of their 8-ohm power into 4 ohms. That means MUCH higher current.
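Putting a number on "MUCH higher", using that 150% rule of thumb on the same hypothetical 95 W @ 8 ohm amp:

```python
import math

P8 = 95.0               # rated power into 8 ohms (assumed, from the text)
P4 = 1.5 * P8           # ~142.5 W into 4 ohms per the 150% rule of thumb

I8 = math.sqrt(P8 / 8)  # ~3.45 A
I4 = math.sqrt(P4 / 4)  # ~5.97 A

print(round(I4 / I8, 2))  # 1.73 -- about 73% more current
```

The ratio works out to sqrt(1.5 * 2) = sqrt(3), regardless of the amp's actual wattage.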
Any comments on the above?
