Since figures seem to be hard to come by, I recently came up with what I think/hope is a reasonably viable method to measure the peak voltage output of a receiver or pre-amp. Maybe the more electronically inclined here can comment.
It’s based on the premise that a 0 dBFS signal is the hottest that we’ll ever get from our digital programming media and playback components.
The first thing needed is a 60 Hz sine wave signal. It shouldn’t be hard to find sine waves online somewhere; the BFD Guide at the Home Theater Shack Forum has a link to some.
That’s where I got the sine waves I downloaded, but their level was lower than 0 dBFS, so I used an audio editing program to boost them (I actually stopped just short, at ¼ dB below 0 dBFS). I also used the audio editor to lengthen the sine wave signal to one minute, from the 10–15 seconds it originally was.
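For anyone who’d rather generate the tone than download and edit one, a short Python script using only the standard library will do the same thing. This is just a sketch of one way to do it; the file name, CD sample rate, and the ¼ dB-down level are my own choices:

```python
import math
import struct
import wave

RATE = 44100        # CD sample rate
FREQ = 60           # test-tone frequency, Hz
SECONDS = 60        # one-minute tone
LEVEL_DBFS = -0.25  # 1/4 dB below full scale

# Peak sample value for a 16-bit file at the chosen level
amp = int(32767 * 10 ** (LEVEL_DBFS / 20))

with wave.open("60hz_test.wav", "wb") as w:
    w.setnchannels(2)   # stereo, so both L and R outputs get the tone
    w.setsampwidth(2)   # 16-bit samples
    w.setframerate(RATE)
    frames = bytearray()
    for n in range(RATE * SECONDS):
        s = int(amp * math.sin(2 * math.pi * FREQ * n / RATE))
        frames += struct.pack("<hh", s, s)  # same sample to both channels
    w.writeframes(frames)
```

The resulting WAV can then be burned to CD with whatever burning software you normally use.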
Next, burn your 60 Hz, 0 dBFS test signal to a CD so you can play it back through your system. Set your receiver or preamp for basic stereo sound, with bass management set for full-range speakers. On my receiver, the only level adjustments in the menu are for the center and surround speakers; that is, their levels are adjusted around the fixed level of the L/R speakers. I don’t know if this is the case with all receivers and pre-pros, but if your menu has level adjustments for the main left and right speakers, they should be trimmed to maximum for this test.
Disconnect your speakers, turn the volume all the way up, and start the test signal. At this point you can measure the preamp’s output voltage with a standard VOM set to AC volts. I suggested a 60 Hz signal because it should give the most accurate reading; I’m told VOMs are typically calibrated at 60 Hz.
Using this method, I measured 4.4 volts from my Yamaha RX-V1’s preamp outputs, and 8.8 volts from the mono subwoofer output.
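Assuming the meter is showing an RMS value (which I’m told is what a VOM’s AC range is calibrated to display for a sine wave), the peak voltage is the reading times the square root of 2, and the gap between two readings can be expressed in dB. A quick sanity check on my own numbers, with the RMS assumption flagged:

```python
import math

MAIN_V = 4.4  # reading at the main preamp outputs (assumed to be RMS)
SUB_V = 8.8   # reading at the mono subwoofer output (assumed to be RMS)

# Peak voltage of a sine wave is the RMS value times sqrt(2)
main_peak = MAIN_V * math.sqrt(2)
sub_peak = SUB_V * math.sqrt(2)

# Level difference between the sub and main outputs, in dB
diff_db = 20 * math.log10(SUB_V / MAIN_V)

print(f"Main outputs: {main_peak:.1f} V peak")              # 6.2 V peak
print(f"Sub output:   {sub_peak:.1f} V peak")               # 12.4 V peak
print(f"Sub runs {diff_db:.1f} dB hotter than the mains")   # 6.0 dB
```

Interestingly, the sub output on my receiver works out to exactly double the mains, a 6 dB difference.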
I’m no electronics techie, but I’m told a VOM’s AC range is calibrated to read the RMS value of a sine wave, so the reading should be directly comparable to an amp’s sensitivity spec, which is typically given as volts or volts RMS. Either way, it should give you a good idea as to whether or not your receiver can drive a pro-grade amp: just compare your voltage reading to the prospective amp’s sensitivity rating. My measurements, showing a fairly high voltage output for my receiver, appear to correspond to the sensitivity rating of the Carvin amp I’m using. It’s rated for 1 volt sensitivity, and I’m running it with the gain controls set below the half-way point.
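The receiver-versus-amp comparison can also be put in dB terms. Here’s a rough sketch using my own numbers (substitute your reading and your amp’s rated sensitivity, both assumed here to be RMS figures):

```python
import math

preamp_v = 4.4           # your measured preamp output voltage
amp_sensitivity_v = 1.0  # amp's rated input sensitivity for full output
                         # (1 V for my Carvin)

# How far the preamp can drive the amp past full output, in dB.
# Anything at or above 0 dB means the receiver can drive the amp
# all the way; a healthy positive margin explains why the amp's
# gain controls end up well below maximum.
margin_db = 20 * math.log10(preamp_v / amp_sensitivity_v)
print(f"Drive margin: {margin_db:.1f} dB")  # 12.9 dB
```

That double-digit margin would square with my gain knobs sitting below the half-way point.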
As noted, I’d appreciate feedback from our electronics-savvy members.
Regards,
Wayne A. Pflughaupt