I can see the logic behind using W/m, though.
If a 90W entry-level receiver can only do 115W into 4 ohms, then the limit of that amp is set by its maximum current capability, not its maximum voltage capability, when driving a nominally 4 ohm speaker. An amp limited only by voltage would double its 8 ohm power into 4 ohms (180W); at 115W it's running out of current first.
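A quick back-of-the-envelope sketch of that voltage-vs-current ceiling. The v_max/i_max numbers below are made up to roughly match the example, not specs from any real receiver:

```python
# Rough model: output into a resistive load is capped by whichever
# runs out first, voltage swing or current delivery.
# P_voltage_limited = Vmax^2 / R, P_current_limited = Imax^2 * R.

def max_power(v_max: float, i_max: float, load_ohms: float) -> float:
    """Max power into a resistive load, in watts."""
    p_voltage_limited = v_max ** 2 / load_ohms
    p_current_limited = i_max ** 2 * load_ohms
    return min(p_voltage_limited, p_current_limited)

# Hypothetical entry-level receiver: ~26.8V of swing (90W into 8 ohms)
# but only ~5.4A of current.
v_max, i_max = 26.8, 5.4

for load in (8, 4):
    print(f"{load} ohms: {max_power(v_max, i_max, load):.0f} W")
# 8 ohms: 90 W    (voltage limited)
# 4 ohms: 117 W   (current limited -- nowhere near the 180W doubling)
```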
It won't help you compare the sensitivity of two speakers, i.e. how loud each will get from the same amplifier voltage.
But it will let you decide whether you need more "power" than you have. Power is a function of both voltage AND current. Since amps aren't rated by their voltage and current delivery abilities, but by "power," manufacturers have to cater to the people buying the product. Not many people would be comfortable with a spec that said:
"This amp can swing 60V to push current through a 32 ohm load, but can only put 20V across a 4 ohm load" <-- which, by the way, works out to nearly the same power: 60²/32 ≈ 112W versus 20²/4 = 100W.
They like the idea of "more power".
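For the skeptical, a quick sanity check of that aside, assuming purely resistive loads and P = V²/R:

```python
# Similar "power", very different voltage/current mixes (P = V^2 / R).
for volts, ohms in ((60, 32), (20, 4)):
    watts = volts ** 2 / ohms
    amps = volts / ohms
    print(f"{volts} V into {ohms} ohms: {watts:.0f} W at {amps:.2f} A")
# 60 V into 32 ohms: 112 W at 1.88 A
# 20 V into 4 ohms:  100 W at 5.00 A
```

Note how different the current draw is: the 4 ohm case needs over twice the amps for slightly less power, which is exactly what trips up an entry-level power supply.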
I believe in 2.83V/1m sensitivity ratings, and I don't believe in nominal impedance. A 2.83V spec is voltage-referenced (it's 1W only if the load is truly 8 ohms), which matches how solid-state amps behave as near-voltage sources. And an impedance and phase graph is far more useful for finding the right amp for the job.
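As for why 2.83V specifically: it's √8, so it dissipates exactly 1W in an 8 ohm resistor and scales with the load from there. A minimal illustration, assuming purely resistive loads (real speakers aren't, which is the whole argument for wanting the impedance/phase graph):

```python
# 2.83 V is sqrt(8): exactly 1 W into 8 ohms, but 2 W into 4 ohms --
# so 2.83V/1m is really a *voltage* sensitivity spec.
VOLTS = 2.83

for ohms in (16, 8, 4, 2):
    watts = VOLTS ** 2 / ohms
    print(f"2.83 V into {ohms:>2} ohms -> {watts:.2f} W")
# 2.83 V into 16 ohms -> 0.50 W
# 2.83 V into  8 ohms -> 1.00 W
# 2.83 V into  4 ohms -> 2.00 W
# 2.83 V into  2 ohms -> 4.00 W
```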