Speaker sensitivity input is already expressed as 2.83 V into 8 ohms, which works out to 1 W, but not everyone corrects that figure when the load is 4 ohms.
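(To spell that out with P = V²/R — just a quick back-of-the-envelope check, nothing more:)

```python
# Power delivered by the standard 2.83 V test level into different loads: P = V^2 / R
import math

v = 2.83  # volts

for r in (8, 4):
    p = v ** 2 / r                 # watts into the load
    db_vs_1w = 10 * math.log10(p)  # offset relative to 1 W
    print(f"{v} V into {r} ohms = {p:.2f} W ({db_vs_1w:+.1f} dB vs 1 W)")

# 2.83 V into 8 ohms is ~1 W; into 4 ohms it is ~2 W (+3 dB), which is the
# correction a per-watt sensitivity figure needs for a 4-ohm speaker.
```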
Exactly, and that's one of the reasons it should have been specified as voltage and current ratings instead of power (Watts). That way we only need to know the minimum impedance of the speaker to figure out whether a given power amplifier is rated for the voltage and current needed through the impedance peaks and dips, and not worry about the Watts.
The way it is now isn't actually easier for consumers to understand; it only seems easier because manufacturers have force-fed it from day one. Ask someone how much power a Revel P208 consumes and you will get all kinds of answers. How many would actually know that the answer is "it depends"? And depends on what? And that assumes the average consumer knows what a watt is to begin with.
They need to show the specs in a way that the average child will understand.
If the average child understands even just Ohm's law, he/she should be able to tell that an amp rated for:
Max output voltage: 48 V, at <= 10 A per channel
Max output current: 16 A, at <= 30 V per channel
should have no trouble driving a speaker with specs like 4 ohms nominal, 2 ohms minimum, recommended power 50 to 400 W (but then again, it should have been rated in voltage and current too).
Still, applying Ohm's law at the 2-ohm minimum: 30 V / 2 ohms = 15 A, which is within the 16 A rating, so the amp can deliver the current the speaker needs. Since it is also rated for 48 V at up to 10 A, it clearly has enough voltage for that speaker too.
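Just to spell that check out in code (a quick sketch using the made-up ratings above; the helper and its spec format are mine, not any standard):

```python
# Sketch: can the amp swing its rated voltage into the speaker's minimum
# impedance without exceeding its rated current?  Plain Ohm's law: I = V / Z.

def can_drive(v_at_imax: float, i_max: float, z_min_ohms: float) -> bool:
    current_needed = v_at_imax / z_min_ohms
    return current_needed <= i_max

# Amp rated 16 A at up to 30 V, speaker dipping to 2 ohms:
# 30 V / 2 ohms = 15 A, which is within the 16 A rating.
print(can_drive(v_at_imax=30.0, i_max=16.0, z_min_ohms=2.0))  # True
```

The 48 V / 10 A point answers the voltage-headroom question the same way for the higher-impedance part of the speaker's range.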
If the same amp is instead rated by power output, say 480 WPC, people will end up arguing: some will say yes, some will say no because the speaker has impedance dips below 4 ohms, some high phase angles, etc., blah blah blah...
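Here is why the bare WPC number leaves room to argue (a contrived comparison with made-up amps, just to show the point):

```python
# Two hypothetical amps, both honestly rated "480 W into 4 ohms",
# but with different current limits -- the wattage alone can't tell them apart.
import math

def volts_amps_from_power(p_watts: float, z_ohms: float) -> tuple[float, float]:
    """Voltage and current implied by a power rating into a given load."""
    return math.sqrt(p_watts * z_ohms), math.sqrt(p_watts / z_ohms)

v4, i4 = volts_amps_from_power(480, 4)
print(f"480 W into 4 ohms implies about {v4:.1f} V and {i4:.1f} A")  # ~43.8 V, ~11.0 A

# What the rating does NOT say: behaviour into a 2-ohm dip.
for name, i_limit in (("Amp A", 22.0), ("Amp B", 12.0)):
    v_into_2 = min(v4, i_limit * 2)  # voltage achievable into 2 ohms given the current limit
    print(f"{name}: at most {v_into_2:.1f} V into 2 ohms (~{v_into_2**2 / 2:.0f} W)")
# Amp A holds its full swing into 2 ohms; Amp B runs out of current well before that.
```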