Good point, I had never thought of it that way. It's difficult to Google and come up with much about this (most of what comes back pertains to speaker sensitivity), but I did dig
this up, which indicates that I was stating it properly:
Regards,
Wayne A. Pflughaupt
It depends on the original purpose of the amp. It used to be that line-level source devices for pro audio use usually went to 0 VU and that's it, often with a fixed output level. The preamp would provide enough output to satisfy the power amp's sensitivity (for full output), and the power amp had no sensitivity controls. Then people began to mix & match pro and consumer-grade equipment and the standards became blurred, with "prosumer" equipment coming out with RCA and TRS input/output jacks instead of XLR. In its original form, pro audio equipment was clean and quiet, and it reached full power with no problem because it was designed to work together.
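To put rough numbers on that mismatch, here's a minimal sketch assuming the usual nominal references of +4 dBu for pro line level (roughly where 0 VU sits on many pro meters) and -10 dBV for consumer gear; exact calibration varies by manufacturer:

```python
import math

def dbu_to_vrms(dbu):
    """Convert dBu to volts RMS (0 dBu = 0.7746 V RMS, i.e. 1 mW into 600 ohms)."""
    return 0.7746 * 10 ** (dbu / 20)

def dbv_to_vrms(dbv):
    """Convert dBV to volts RMS (0 dBV = 1.0 V RMS)."""
    return 10 ** (dbv / 20)

# Nominal pro line level vs. nominal consumer line level
print(f"+4 dBu  = {dbu_to_vrms(4):.3f} V RMS")    # ~1.228 V
print(f"-10 dBV = {dbv_to_vrms(-10):.3f} V RMS")  # ~0.316 V
```

That works out to roughly a 12 dB difference between the two nominal levels, which is why mixing the two families without adjustable sensitivity gets messy.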
Car audio fans became increasingly attracted to their beloved dB contests, and while 1 V of output from a head unit was fine when people just wanted a good-sounding stereo, they decided 2 V was better because the amp's input sensitivity could be turned down with its gain control, which also lowered the noise level. Then they upped the head unit's output even more, and pretty soon 4 V output was the norm instead of an extreme. 4 V will overdrive most power amp inputs if they're fixed, and some won't handle it even if they're variable.
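For a feel of how far 4 V overshoots, here's a quick sketch with hypothetical numbers (the 1 V fixed sensitivity is just an example, not any particular amp):

```python
import math

def overdrive_db(source_vrms, input_sensitivity_vrms):
    """How far (in dB) a source's full output exceeds an amp's input sensitivity.
    A positive result means the input stage gets overdriven unless a gain
    control can absorb the difference."""
    return 20 * math.log10(source_vrms / input_sensitivity_vrms)

# Hypothetical: a 4 V head unit feeding an amp with a fixed 1 V sensitivity
print(f"{overdrive_db(4.0, 1.0):.1f} dB over")  # ~12.0 dB
```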
If you want to Google audio amp specs, look for NAB (National Association of Broadcasters) data. The AES should also have info. Most amplifier spec sheets include the input voltage needed to develop rated output.
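That sensitivity figure also tells you the amp's voltage gain, since the output voltage at rated power into a resistive load is sqrt(P x R). A small sketch with a made-up spec (100 W into 8 ohms, 1.4 V input for rated output):

```python
import math

def rated_output_vrms(power_w, load_ohms):
    """RMS output voltage at rated power into a resistive load: V = sqrt(P * R)."""
    return math.sqrt(power_w * load_ohms)

def voltage_gain_db(power_w, load_ohms, sensitivity_vrms):
    """Voltage gain implied by a published input-sensitivity spec."""
    return 20 * math.log10(rated_output_vrms(power_w, load_ohms) / sensitivity_vrms)

# Hypothetical spec sheet: 100 W into 8 ohms, 1.4 V input for rated output
print(f"output at rated power: {rated_output_vrms(100, 8):.1f} V RMS")  # ~28.3 V
print(f"implied voltage gain:  {voltage_gain_db(100, 8, 1.4):.1f} dB")  # ~26.1 dB
```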
With all of the free software out there (TrueRTA is one of them), I'm surprised more people don't use the oscilloscope function for setting up their systems. They can measure the output voltage from sources and preamps, then look at the waveform after each processor stage and even at the output to the speakers. That way, there's no doubt about whether the system is distorting.
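If anyone would rather script it than eyeball a scope trace, here's a rough sketch of the same idea (flat tops in a captured waveform mean clipping); it just looks for runs of samples pinned at the peak, and the threshold and run length are arbitrary choices, not any standard:

```python
import numpy as np

def looks_clipped(samples, threshold=0.999, run_length=3):
    """Flag a capture as clipped if it contains a run of consecutive samples
    pinned at (or just below) the peak value -- the flat top a scope shows."""
    x = np.asarray(samples, dtype=float)
    peak = np.max(np.abs(x))
    if peak == 0:
        return False
    pinned = np.abs(x) >= threshold * peak
    run = 0
    for p in pinned:
        run = run + 1 if p else 0
        if run >= run_length:
            return True
    return False

# Example: a clean 1 kHz sine vs. the same sine driven into hard clipping
t = np.linspace(0, 1, 48000, endpoint=False)
clean = np.sin(2 * np.pi * 1000 * t)
clipped = np.clip(1.5 * clean, -1.0, 1.0)
print(looks_clipped(clean), looks_clipped(clipped))  # False True
```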