Amplifier Voltage Gain matching your Preamp and Amplifier

admin

Audioholics Robot
Staff member
Ever look at an amplifier spec sheet and wonder what the Voltage Gain and Input Sensitivity specs mean? Amplifier Voltage Gain and Input Sensitivity are important factors in determining how nicely the amplifier in question will mate with a particular preamp. Our article discusses this topic in detail and walks you through some simple calculations so you can make a more informed purchasing decision.
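For anyone who wants to run the numbers themselves, the gain/sensitivity relationship works out like this. This is a quick Python sketch; the 300 W, 8 ohm, and 29 dB figures are hypothetical examples chosen for illustration, not the specs of any particular amp:

```python
import math

def input_sensitivity(rated_power_w, load_ohms, gain_db):
    """Voltage (Vrms) a preamp must deliver to drive an amp to rated power."""
    v_out = math.sqrt(rated_power_w * load_ohms)  # Vrms at rated power
    v_gain = 10 ** (gain_db / 20)                 # dB -> linear voltage gain
    return v_out / v_gain

# Hypothetical amp: 300 W into 8 ohms with 29 dB of voltage gain
print(round(input_sensitivity(300, 8, 29), 2))  # ~1.74 Vrms needed from the preamp
```

If your preamp can't swing that much voltage cleanly, you never reach the amp's rated power.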



Read our Amplifier Voltage Gain Article. What preamp/amp combo are you running? Discuss.
 
Last edited by a moderator:
internetmin

Audioholic
Forget the math for a sec since all the engineers love the math :). If you have a THX certified pre-pro and a THX certified amp, shouldn't there be a match in that instance? Would that be an argument for getting THX-certified gear, since the gain and impedance are already set and pre-determined values?
 
slipperybidness

Audioholic Warlord
Forget the math for a sec since all the engineers love the math :). If you have a THX certified pre-pro and a THX certified amp, shouldn't there be a match in that instance? Would that be an argument for getting THX-certified gear, since the gain and impedance are already set and pre-determined values?
I think that most of the AHers don't put a lot of faith in the THX cert. Just another emblem that costs more, while there are plenty of more-capable products on the market that aren't cert'd.
 
avengineer

Banned
Wow, there's a lot wrong with that article.

It implies there are actually preamps in this world that clip at 1 Vrms. I don't even know how you could design such a thing without doing it deliberately, which nobody would, nor has for the past 30 years at least. Clipping at 1 Vrms would be clipping at 2.83 V peak-to-peak, which would imply a preamp output stage supply running at somewhere around 6V, or bipolar 3V, or some kind of output pad. Nope, don't think so. Clipping at 1 volt is just a really exaggerated example; it never happens in real life.
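The Vrms-to-peak-to-peak arithmetic above is easy to check (sine wave assumed):

```python
import math

v_rms = 1.0                    # the article's 1 Vrms clipping example
v_peak = v_rms * math.sqrt(2)  # sine peak voltage
v_pp = 2 * v_peak              # peak-to-peak swing
print(round(v_pp, 2))          # 2.83 V, matching the figure above
```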

Next we have the noise issue. Again, in real life, since we do have more than 1Vrms at the pre, and only need a little less than that to drive the power amp, where's the noise problem? From there we go to this sentence: "Besides noise configuration, an increase in amplifier gain will decrease in the bandwidth (BW) of the circuit, meaning some valuable data may get eliminated from the input signal (the amplifier works as a filter)." Increasing the gain of an amplifier will decrease bandwidth, but the input gain controls of, say, a QSC amp (or any other) do not actually change the internal gain of the amp! Rather, the control acts as an input attenuator, permitting overall system gain adjustment without changing the bandwidth of the amp. Sorry, there'll be no amplifier bandwidth changing with the gain control.

Then, "Additionally, having a high gain amplifier may introduce DC offset at the output. In an amplifier with high input impedance, increasing the gain will introduce a DC offset which affects the operating point of the circuit (changes the balance of the amplifier)." Yes, this might be true if we were dealing with the very most raw and basic amplifier building blocks, but that's not what a power amp is. Again, internal gain is fixed, only input attenuation changes. DC offsets are a really bad thing, and in any modern and well designed amp, any DC offset has been eliminated by some means, like a coupling capacitor in the right place, active DC servo, etc. There's simply nothing available to the user that could possibly alter the DC offset or operating point or balance of the amplifier's circuit.

Moving to the discussion of load impedance, "There is naturally a big difference between rating voltage output on an open circuit, i.e. no load, versus 600 ohms, which is likely to be a considerably tougher task than most amplifiers you’re likely to meet, which have input impedances on the order of tens of thousands of ohms. Rating open circuit doesn’t take into account potential current limits which could bring on preamp clipping much sooner than you might expect once you introduce real world conditions including the cable impedance."

I have no idea why a 600 ohm load would even be mentioned, as it never occurs in the real world as a preamp load. Never. Never has, either. The paragraph is correct in that amplifiers have input impedances of 10K or higher, which is, essentially, an open circuit to the preamp. So what "potential current limits" are we then talking about? What's the load that brings the onset of clipping down? And what on earth are we talking about "cable impedance" for? Unless the cable between the preamp and power amp is several miles long, the characteristic cable impedance is a complete non-factor. And if we did have a 10 mile long preamp cable, the cable impedance would be the least of your worries, because now we'd have to equalize the cable, and the preamp isn't up to that task without external help. None are. But it's really silly to cite such a thing as a problem that could potentially increase the possibility of clipping! Just so we are clear, there is nothing about a power amp input impedance, or cable impedance, that can overload a preamp output such that the clipping point is reduced!

In the concluding paragraph, we have this gem, referring to voltage gain, "However, this little detail can be the difference between a truckload of distortion or noise and nice clean sound." Talk about voltage gain! A "truckload"? Really??? This article has amplified a tiny issue by 20dB at least into something huge, major, and catastrophic. It is none of that. Nearly every preamp output will drive nearly every power amp to full output.

I didn't mention the speaker load paragraph, it's sort of out of context of a voltage gain discussion.

I appreciate the attempt, but let's try to provide accurate and relevant information to the unsuspecting hobbyist instead of alarmist and inaccurate reporting.
 
Last edited by a moderator:
RichB

Audioholic Field Marshall
^^^
Ouch. Something tells me this thread could get interesting :)

- Rich
 
avengineer

Banned
I think that most of the AHers don't put a lot of faith in the THX cert. Just another emblem that costs more, while there are plenty of more-capable products on the market that aren't cert'd.
This is correct, but let's not totally devalue the THX badge.

To earn THX certification an audio device has to pass a battery of tests that take several days to run and result in a document about 250 pages long. The passing scores were chosen such that the device would be able to present audio "as the creator intended" in a room of a particular volume. Some of those specs were easy, but some, such as dynamic headroom in an amplifier, increased manufacturing costs significantly. Because of the level of detail of the certification tests, many products that were designed as THX-capable devices didn't pass the initial cert tests, and had to be modified and re-tested. But since THX doesn't test prototypes, that also means that final designs that come off an assembly line may have to be changed, and that is expensive.

The device that has a THX badge did in fact cost more, sometimes a lot more, to make. And the result was and is a certain guaranteed quality level. But things at THX went a bit wonky. Several things happened that make THX certification today less valuable to a manufacturer.

First is consumers' confusion about what THX is. THX didn't help this at all by broadening what it certified, then creating promotional trailers that said things like "Let's hear it in THX!", as if THX were some sort of format. They also sub-divided their certifications, which made sense because it would bring the THX badge to lower cost products. But then it was sub-sub-divided into Select, Select2, Ultra, Ultra2, and Ultra2 Plus. And that's just audio equipment; the badge is on cables (!), displays, microphones, even drywall. When something confuses consumers they start to ignore it and go for what they know. And that has spelled big trouble at THX.

And then the patents expired. Now anyone can use technology unique to THX, things like Re-EQ (an area so confused now as to be almost pointless), mono surround decorrelation (remember the last time you had a mono surround track?), etc. Moving forward, THX has introduced a whole family of new technologies, some good, some bested by others. But to use them you do have to pass certification, so they've tried to stimulate new interest. And since manufacturers have an easier time building better components today, it's not so hard to make a THX-like performer. But would a consumer pay more for it? The answer is moving ever closer to a general "No". Since consumers have lost interest, so have manufacturers. Denon has no THX product in their line, yet just half a decade ago there were several products. Pioneer and Onkyo still do at the upper end of their AVR lines. And there are THX speakers (thankfully!) out there still.

The badge does still mean "Quality", and it does cost more to make something that passes even the lowly Select level. Yes, there is a lot of fine product out there that isn't THX certified, and some might even pass a certification test at least partially. But if a consumer doesn't want to dig deeply into the specs and understand their application to his system, picking a THX product at least guarantees a certain level of performance behind a single emblem: the badge. It does mean something, just not to as many people as hoped.
 
gene

Audioholics Master Chief
Administrator
Wow, there's a lot wrong with that article.

It implies there are actually preamps in this world that clip at 1 Vrms. I don't even know how you could design such a thing without doing it deliberately, which nobody would, nor has for the past 30 years at least. Clipping at 1 Vrms would be clipping at 2.83 V peak-to-peak, which would imply a preamp output stage supply running at somewhere around 6V, or bipolar 3V, or some kind of output pad. Nope, don't think so. Clipping at 1 volt is just a really exaggerated example; it never happens in real life.

Next we have the noise issue. Again, in real life, since we do have more than 1Vrms at the pre, and only need a little less than that to drive the power amp, where's the noise problem? From there we go to this sentence: "Besides noise configuration, an increase in amplifier gain will decrease in the bandwidth (BW) of the circuit, meaning some valuable data may get eliminated from the input signal (the amplifier works as a filter)." Increasing the gain of an amplifier will decrease bandwidth, but the input gain controls of, say, a QSC amp (or any other) do not actually change the internal gain of the amp! Rather, the control acts as an input attenuator, permitting overall system gain adjustment without changing the bandwidth of the amp. Sorry, there'll be no amplifier bandwidth changing with the gain control.

Then, "Additionally, having a high gain amplifier may introduce DC offset at the output. In an amplifier with high input impedance, increasing the gain will introduce a DC offset which affects the operating point of the circuit (changes the balance of the amplifier)." Yes, this might be true if we were dealing with the very most raw and basic amplifier building blocks, but that's not what a power amp is. Again, internal gain is fixed, only input attenuation changes. DC offsets are a really bad thing, and in any modern and well designed amp, any DC offset has been eliminated by some means, like a coupling capacitor in the right place, active DC servo, etc. There's simply nothing available to the user that could possibly alter the DC offset or operating point or balance of the amplifier's circuit.

Moving to the discussion of load impedance, "There is naturally a big difference between rating voltage output on an open circuit, i.e. no load, versus 600 ohms, which is likely to be a considerably tougher task than most amplifiers you’re likely to meet, which have input impedances on the order of tens of thousands of ohms. Rating open circuit doesn’t take into account potential current limits which could bring on preamp clipping much sooner than you might expect once you introduce real world conditions including the cable impedance."

I have no idea why a 600 ohm load would even be mentioned, as it never occurs in the real world as a preamp load. Never. Never has, either. The paragraph is correct in that amplifiers have input impedances of 10K or higher, which is, essentially, an open circuit to the preamp. So what "potential current limits" are we then talking about? What's the load that brings the onset of clipping down? And what on earth are we talking about "cable impedance" for? Unless the cable between the preamp and power amp is several miles long, the characteristic cable impedance is a complete non-factor. And if we did have a 10 mile long preamp cable, the cable impedance would be the least of your worries, because now we'd have to equalize the cable, and the preamp isn't up to that task without external help. None are. But it's really silly to cite such a thing as a problem that could potentially increase the possibility of clipping! Just so we are clear, there is nothing about a power amp input impedance, or cable impedance, that can overload a preamp output such that the clipping point is reduced!

In the concluding paragraph, we have this gem, referring to voltage gain, "However, this little detail can be the difference between a truckload of distortion or noise and nice clean sound." Talk about voltage gain! A "truckload"? Really??? This article has amplified a tiny issue by 20dB at least into something huge, major, and catastrophic. It is none of that. Nearly every preamp output will drive nearly every power amp to full output.

I didn't mention the speaker load paragraph, it's sort of out of context of a voltage gain discussion.

I appreciate the attempt, but let's try to provide accurate and relevant information to the unsuspecting hobbyist instead of alarmist and inaccurate reporting.
The same could be said about jumping to conclusions and making unsubstantiated claims in a forum post to discredit this article when you have no data to back it up.
1. I have measured MANY receiver preouts that clip at 1Vrms either b/c of the limitations of the Volume controls or b/c the preamp was using a single +5V rail for the opamps.
2. There are esoteric amps that have 600 ohm input impedances. I don't agree with making a design like this, but they exist.
3. Cable impedance DOES matter depending on the input impedance of the amp and how esoteric the actual cable is (capacitance being the most concerning parameter). A preamp driving an amp with a 50k input impedance connected to an interconnect cable of 1000pF would yield a -3dB loss at 3.1kHz. This is certainly NOT acceptable. So we prefer a lower capacitance cable of course!
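For reference, the corner-frequency arithmetic behind point 3 is the standard single-pole RC formula. The sketch below simply plugs in the stated 50k and 1000 pF, without taking a side on which impedance actually sets the pole:

```python
import math

def corner_hz(r_ohms, c_farads):
    """-3 dB corner frequency of a single-pole RC low-pass."""
    return 1 / (2 * math.pi * r_ohms * c_farads)

# Gene's numbers: 50k ohms against 1000 pF of cable capacitance
print(round(corner_hz(50e3, 1000e-12)))  # ~3183 Hz
```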
 
Last edited by a moderator:
gene

Audioholics Master Chief
Administrator
This is correct, but let's not totally devalue the THX badge.

To earn THX certification an audio device has to pass a battery of tests that take several days to run and result in a document about 250 pages long. The passing scores were chosen such that the device would be able to present audio "as the creator intended" in a room of a particular volume. Some of those specs were easy, but some, such as dynamic headroom in an amplifier, increased manufacturing costs significantly. Because of the level of detail of the certification tests, many products that were designed as THX-capable devices didn't pass the initial cert tests, and had to be modified and re-tested. But since THX doesn't test prototypes, that also means that final designs that come off an assembly line may have to be changed, and that is expensive.

The device that has a THX badge did in fact cost more, sometimes a lot more, to make. And the result was and is a certain guaranteed quality level. But things at THX went a bit wonky. Several things happened that make THX certification today less valuable to a manufacturer.

First is consumers' confusion about what THX is. THX didn't help this at all by broadening what it certified, then creating promotional trailers that said things like "Let's hear it in THX!", as if THX were some sort of format. They also sub-divided their certifications, which made sense because it would bring the THX badge to lower cost products. But then it was sub-sub-divided into Select, Select2, Ultra, Ultra2, and Ultra2 Plus. And that's just audio equipment; the badge is on cables (!), displays, microphones, even drywall. When something confuses consumers they start to ignore it and go for what they know. And that has spelled big trouble at THX.

And then the patents expired. Now anyone can use technology unique to THX, things like Re-EQ (an area so confused now as to be almost pointless), mono surround decorrelation (remember the last time you had a mono surround track?), etc. Moving forward, THX has introduced a whole family of new technologies, some good, some bested by others. But to use them you do have to pass certification, so they've tried to stimulate new interest. And since manufacturers have an easier time building better components today, it's not so hard to make a THX-like performer. But would a consumer pay more for it? The answer is moving ever closer to a general "No". Since consumers have lost interest, so have manufacturers. Denon has no THX product in their line, yet just half a decade ago there were several products. Pioneer and Onkyo still do at the upper end of their AVR lines. And there are THX speakers (thankfully!) out there still.

The badge does still mean "Quality", and it does cost more to make something that passes even the lowly Select level. Yes, there is a lot of fine product out there that isn't THX certified, and some might even pass a certification test at least partially. But if a consumer doesn't want to dig deeply into the specs and understand their application to his system, picking a THX product at least guarantees a certain level of performance behind a single emblem: the badge. It does mean something, just not to as many people as hoped.
THX has become quite a watered down proposition IMO. Their certification for speakers/subs is pretty easy to pass. Our "Large" bassaholic rating far exceeds anything THX does, not to mention getting our Extreme rating.

As for amp testing, that again is a pretty easy thing to pass for THX. That's why even the less-than-stellar ICE models Pioneer used to use received Ultra2 certification.

As for Blu-ray players, well, read this:
Lexicon BD-30 Blu-ray Player (Oppo BDP-83 Clone) Review | Audioholics

But hey, THX desktop speakers and speaker cables are cool :)

From an interoperability standpoint, I guess there is still some value in that, but it's rare to find legitimately well-designed gear without the THX logo that doesn't play well together.
 
avengineer

Banned
A few thoughts:


I don't think by giving a simple example of 1Vrms that there's an inherent implication that many (or even any) preamps are clipping at 1Vrms (though that's subject to interpretation); in fact it's expressly mentioned that AH has measured a couple of midrange receivers (pretty much the level you'd have to step up to in order to actually get preouts in the first place) capable of considerably more unclipped output.


Citing an example implies "real world" to anyone who is unable to see clearly that it's an exaggeration. "Suppose you have a receiver that can deliver 1 volt RMS from its preamplifier outputs before clipping; if you pair this receiver with a high powered amplifier expecting a huge boost in headroom, you might be sorely disappointed if its voltage gain is a below average 27dB." If the example wasn't meant to be taken seriously, then why write it with the phrase "sorely disappointed"? This is alarmist writing, plain and simple.
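The arithmetic behind the quoted "sorely disappointed" scenario can be sketched like this (the 8 ohm load is my assumption for illustration):

```python
v_pre = 1.0    # preamp clipping point from the quoted example (Vrms)
gain_db = 27.0 # the "below average" amp voltage gain
load = 8.0     # assumed speaker load, ohms

v_out = v_pre * 10 ** (gain_db / 20)  # max unclipped amp output voltage
p_max = v_out ** 2 / load             # power reachable before the preamp clips
print(round(p_max, 1))                # ~62.6 W into 8 ohms, whatever the amp's rating
```

That is the whole point of contention: whether a 1 Vrms preamp ceiling is a realistic premise in the first place.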

OTOH, going back to the XPA-2 review:

I'll tend to trust Gene on his experiences measuring equipment.


Sorry, I can't trust that statement unless the reference is to a rare case of really horrible and uninformed design work. Someone placing a preamp output on a device knows what it's for and what it is to drive. Placing a clipping point at 1Vrms is just so completely wrong as to be unbelievable, except for, like I say, a really horrible design... which probably has no business being used as a preamp in the first place.

(Re: noise issue)
It can crop up when you pair an amplifier with high gain (such as the XPA-2) with a fairly high sensitivity loudspeaker (say a Klipschorn). For any given output level, the output from the preamp is going to be considerably less than typical, and consequently more prone to noise issues.



In your example of the QSC GX3, with a voltage gain of 32.2dB, the internal noise floor is stated at -100 (no reference, but let's assume the worst case of clipping as the reference at 300W). With the input to the amp terminated (no preamp, just terminated with a resistor) we have the noise floor of the amp at -100dB re: 300W, or 75.3dB below 1W. With Klipsch speakers connected with a specified sensitivity of 97dB 1W/1m, we land on a quiescent noise floor of 21.7dB SPL @ 1 meter (unweighted), with just the amp alone. With a "reasonable" preamp, could we expect a signal to noise ratio at the output worse than, say, a CD? Not usually, and that would put the preamp noise perhaps just slightly higher than the amp, and raise the noise floor slightly. But we're already at 21.7dB SPL unweighted. Apply a weighting filter and that will drop between 6 and 12dB. None of it matters because any room will be acoustically noisier than that. There may, as before, be some really horrible preamp outputs, but they aren't going to be the norm. Odd that on current AVR product you don't even find preamp outs on entry level product.
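The noise-floor arithmetic above can be reproduced like this (the 300 W reference and 97 dB @ 1W/1m sensitivity are the assumptions stated in the post):

```python
import math

noise_db_re_rated = -100.0  # amp noise floor, referenced to rated output
rated_power_w = 300.0       # assumed reference point (worst case, per the post)
sensitivity_db = 97.0       # speaker dB SPL @ 1W/1m (Klipsch example)

# Re-reference the noise floor from 300 W down to 1 W, then map through the speaker
noise_db_re_1w = noise_db_re_rated + 10 * math.log10(rated_power_w)
noise_spl = sensitivity_db + noise_db_re_1w
print(round(noise_spl, 1))  # ~21.8 dB SPL at 1 meter, unweighted
```

That lands within rounding of the 21.7 dB SPL figure quoted above, well below the acoustic noise floor of any real room.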



The article isn't talking about fiddling with a gain knob.


Oh, really? Then what was the intent of this paragraph?

"
Besides noise configuration, an increase in amplifier gain will decrease in the bandwidth (BW) of the circuit, meaning some valuable data may get eliminated from the input signal (the amplifier works as a filter). Additionally, having a high gain amplifier may introduce DC offset at the output. In an amplifier with high input impedance, increasing the gain will introduce a DC offset which affects the operating point of the circuit (changes the balance of the amplifier)."

Why would any consumer be concerned with this unless he had the ability to adjust the actual amp gain? Why mention it at all, if gain is fixed internally? And if you weren't talking about adjusting the front panel knobs, you should have been. It's a perfect opportunity to gain-match a preamp and a power amp!

By the way, in a properly designed amplifier, if actual gain were variable, and there have been some preamps like this, they are designed with careful consideration to gain bandwidth product such that at maximum gain they still have more than ample audio bandwidth. Mentioning the variation in bandwidth with gain is simply highlighting a non-problem, and very alarmist writing.



Probably because it's an example that exceeds any potential load that a preamp should see, i.e. being conservative.

Citing cable impedance in the context of an audio interconnect is absolutely inappropriate. It's not an appropriate example, as nobody will ever tie 10 miles of wire between their preamp and power amp. Cable characteristic impedance becomes a factor when the length of the wire becomes longer than 1/4 of the wavelength of the signal on the wire. A 1/4 wave at 20kHz is on the order of 2.5 km, about a mile and a half (once adjusted for cable propagation velocity), and would become a factor if we were dealing with an impedance matched transmission line. That's not at all what a 10' interconnect is. Again, citing cable impedance here is inappropriate, unimportant, and simply highly alarmist.
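As a sanity check on the scale involved (the 0.66 velocity factor is my assumption of a typical coax-like cable, not a figure from the post):

```python
c = 3.0e8     # free-space propagation speed, m/s
f = 20_000.0  # top of the audio band, Hz
vf = 0.66     # assumed cable velocity factor

quarter_wave_m = (c / f) / 4 * vf
print(round(quarter_wave_m))  # ~2475 m: kilometers of cable, not a 10-foot interconnect
```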


Under a worst case scenario, i.e. excessive noise or clipping the preouts of a bottom feeder receiver, sure.


True, fine, and why then didn't you spend some time helping people identify specifically which AVRs would be classified as bottom feeder receivers so they can be avoided? That would actually be helpful information. Something like: AVR "A" is a bad combination with Power Amp "B" and speakers "C", so that combination can be avoided. You've focused on extreme examples that bring the voltage gain problem to near the surface, like a very low output noisy preamp, a low or high gain power amp, and some of the most efficient speakers in the industry. How about a few normal, real-world examples? Or wouldn't that scare people enough?

All you're doing here is amplifying minor issues to an unrealistically high level and alarming the uninitiated into obsessing about something they have no ability to deal with. This is where myth begins and propagates.

And then you miss the one control that just might help mitigate one of the very few preamp/power amp match issues: the amp's own front panel gain controls.

I stand by what I said: the article is poorly done, contains technical inaccuracies, and uses inappropriate examples. If it were just one item, like a misstatement of the role of cable impedance, I'd probably not say much. But this one is loaded.
 
Last edited by a moderator:
avengineer

Banned
The same could be said about jumping to conclusions and making unsubstantiated claims in a forum post to discredit this article when you have no data to back it up.
1. I have measured MANY receiver preouts that clip at 1Vrms either b/c of the limitations of the Volume controls or b/c the preamp was using a single +5V rail for the opamps.
You know what would be helpful then? List them!
2. There are esoteric amps that have 600 ohm input impedances. I don't agree with making a design like this, but they exist.
So, someone with an entry level AVR that clips its preout at 1V is driving an esoteric power amp with a 600 ohm input? Yeah, it's possible, just not very probable.
3. Cable impedance DOES matter depending on the input impedance of the amp and how esoteric the actual cable is (capacitance being the most concerning parameter). A preamp driving an amp with a 50k input impedance connected to an interconnect cable of 1000pF would yield a -3dB loss at 3.1kHz. This is certainly NOT acceptable. So we prefer a lower capacitance cable of course!
Nope, you've got it wrong. The cable C will be swamped by the output impedance of the preamp, probably 1K or less. The input impedance of the power amp at 50K has no real interaction with the cable C. So I have two problems now. First, your example doesn't work electrically; second, an example with a 3dB loss at 3.1kHz is again alarmist, and not real world.
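A quick sketch of the disputed topology, assuming a 1k source impedance per the "probably 1K or less" figure above:

```python
import math

def corner_hz(r_ohms, c_farads):
    """-3 dB corner of a single-pole RC low-pass."""
    return 1 / (2 * math.pi * r_ohms * c_farads)

c_cable = 1000e-12  # 1000 pF interconnect, as in the example
r_source = 1e3      # assumed preamp output impedance ("probably 1K or less")
r_load = 50e3       # amp input impedance from the example

# The low-pass corner is set by the SOURCE impedance driving the cable C,
# not by the amp's input impedance hanging on the far end of the cable.
print(round(corner_hz(r_source, c_cable)))  # ~159155 Hz: far above the audio band
print(round(corner_hz(r_load, c_cable)))    # ~3183 Hz: only if the source were 50k
```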

C'mon, guys. Let's work with reality.
 
gene

Audioholics Master Chief
Administrator
You know what would be helpful then? List them!

So, someone with an entry level AVR that clips its preout at 1V is driving an esoteric power amp with a 600 ohm input? Yeah, it's possible, just not very probable.

Nope, you've got it wrong. The cable C will be swamped by the output impedance of the preamp, probably 1K or less. The input impedance of the power amp at 50K has no real interaction with the cable C. So I have two problems now. First, your example doesn't work electrically; second, an example with a 3dB loss at 3.1kHz is again alarmist, and not real world.

C'mon, guys. Let's work with reality.
You are free to read my prior receiver reviews in which I bench tested the preamp outs. All of the info is readily available on this site.

As for the cable impedance, I have actually measured this phenomenon when designing audio communication systems for the government. In order to meet their isolation requirements for TEMPEST, a filtered connector had to be inserted between two preamp stages of the design. Although I designed the drivers to be flat to 20kHz, I was measuring a roll-off around 3kHz and couldn't figure out why until I looked at the impedance of the filtered connector driving an opamp with a 10kohm input impedance.

While I love textbook theory, the real world also makes a difference. After measuring and designing audio equipment for over a decade, I go by my experiences, and the merits of Steve's article are sound. I suggest you do a bit more digging rather than citing sophomoric transmission line theory.

You may also be interested in noting that one of the folks who peer reviewed this article (Hadi) is a PhD student at the local university I graduated from, who is well versed in both analog circuit design and microwave engineering.
 
Steve81

Audioholics Five-0
Citing an example implies "real world" to anyone who is unable to see clearly that it's an exaggeration...Sorry, I can't trust that statement
Again, I have to disagree, especially when real world examples are later given. The example is (as should be fairly obvious, given the meager capability 1Vrms represents) meant to demonstrate the issue, nothing more, nothing less. Of course, the fact that Gene has measured receivers whose preouts aren't capable of much more is just icing on the cake, regardless of whether you trust his statement or not.

With a "reasonable" preamp

There lies the problem: sans somebody actually bench testing your receiver, how do you know its preouts qualify as "reasonable"?


True, fine, and why then didn't you spend some time helping people identify specifically which AVRs would be classified as bottom feeder receivers so they can be avoided?
That's rather the point of reviews here, no?
 
Last edited by a moderator:
avengineer

Banned
You are free to read my prior receiver reviews in which I bench tested the preamp outs. All of the info is readily available on this site.

As for the cable impedance, I have actually measured this phenomenon when designing audio communication systems for the government. In order to meet their isolation requirements for TEMPEST, a filtered connector had to be inserted between two preamp stages of the design. Although I designed the drivers to be flat to 20kHz, I was measuring a roll-off around 3kHz and couldn't figure out why until I looked at the impedance of the filtered connector driving an opamp with a 10kohm input impedance.
Interesting, and a specific case, I'm sure. You cited no details, but I have no reason to question your observations in that case, though I'm not sure why it's relevant.
While I love text book theory, real world also makes a difference. After measuring and designing audio equipment for over a decade, I go by my experiences and the merits of Steve's article is sound. I suggest you go do a bit more digging rather than sighting sophomoric transmission line theory.
Congratulations on your 10 year audio career, and I'm not being sarcastic, I mean it. I'm sorry you feel the need to cite your history as a means to reinforce your credibility. I'm not taking issue with your experience; I'm taking issue with the article. I'm not going to bother citing mine, this isn't a p---ing contest.

I'm not sure why transmission line theory is "sophomoric" to you, but I didn't bring it up; Steve did when he mentioned cable impedance as a factor that would overload a preamp output to clipping. Had he (more correctly) used the term "cable capacitance", I would not have cited transmission line theory, which is where the characteristic impedance of a cable comes into play. Yes, capacitance is a part of the complex impedance of a cable, but so are inductance and resistance, both of which are so small as to be non-factors in short cables. So we should be, and are, concerned with cable capacitance. I would still probably have objected, though, because no practical cable capacitance can overload a preamp output to the point it affects the clipping level.

Your example:
"A preamp driving an amp with a 50k input impedance connected to an interconnect cable of 1000pF would yield a -3dB loss at 3.1kHz. "
...is not correct. The values you used are right, but the topology is wrong for that result. The example would be correct if the source impedance were 50k; you stated that value for the load, which won't matter. Line level and preamp output impedances are not that high by a long way. If they were, there would be no way to get response flat to 20kHz through a 10' cable. What did your receiver tests show?
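For readers following the dispute: the corner frequency in question comes from a single-pole RC low-pass formed by the driving (source) impedance against the cable capacitance. A quick sketch with hypothetical values shows why it matters which side of the cable the 50k sits on:

```python
import math

def rolloff_hz(source_ohms: float, cable_farads: float) -> float:
    """-3 dB corner of the RC low-pass formed by the source (output)
    impedance driving the cable capacitance: f = 1 / (2*pi*R*C).
    The amp's much larger input impedance shunts the cable but
    barely moves the corner."""
    return 1.0 / (2.0 * math.pi * source_ohms * cable_farads)

cable_c = 1000e-12  # 1000 pF, the value from the quoted example

# If 50k were the SOURCE impedance, the corner lands in the audio band:
print(round(rolloff_hz(50_000, cable_c)))  # ~3183 Hz

# With a realistic ~100 ohm preamp output impedance, it's far above audio:
print(round(rolloff_hz(100, cable_c)))     # ~1.59 MHz
```

Same R and C values either way; only when the 50k is the source impedance does the pole land near 3.1kHz.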
You may also be interested in noting one of the folks that peer reviewed this article (Hadi) is a PHD student at my local University I graduated from who is well versed both in analog circuit design and microwave engineering.
Yes, that's interesting, and is probably all the more embarrassing for the forum.
 
Last edited by a moderator:
gene

gene

Audioholics Master Chief
Administrator
Interesting, and a specific case I'm sure. You cited no details, but I have no reason to question your observations in that case, though I'm not sure of why it's relevant.

Congratulations on your 10 year audio career, and I'm not being sarcastic, I mean it. I'm sorry you feel the need to cite your history as a means to reinforce your credibility. I'm not taking issue with your experience. I'm taking issue with the article. I'm not going to bother citing mine, this isn't a p---ing contest.

I'm not sure why transmission line theory is "sophomoric" to you, but I didn't bring it up, Steve did when he mentioned cable impedance as a factor that would overload a preamp output to clipping. Had he (more correctly) used the term "cable capacitance", I would not have cited transmission line theory, which is where the characteristic of cable impedance comes into play. Yes, capacitance is a part of the complex impedance of a cable, but so is inductance and resistance, both of which are so small as to be non-factors in short cables. So we should be and are concerned with cable capacitance. I would still probably have objected, though, because no practical cable capacitance can overload a preamp output to the point it affects clipping level.

Your example:
"A preamp driving an amp with a 50k input impedance connected to an interconnect cable of 1000pF would yield a -3dB loss at 3.1kHz. "
...is not correct. The values you used are right, but the topology is wrong for that result. The example would be correct if the source impedance were 50k; you stated that value for the load, which won't matter. Line level and preamp output impedances are not that high by a long way. If they were, there would be no way to get response flat to 20kHz through a 10' cable. What did your receiver tests show?


Yes, that's interesting, and is probably all the more embarrassing for the forum.
Yes, you do need to model preamp, cable, and power amp to get a more accurate picture of what is happening. The point here, however, is that cable capacitance is important and too much acts as an LPF.

The article doesn't discuss characteristic impedance or TL effects. It does talk about low-impedance drive issues that some preamps can have a hard time dealing with.

Nowhere does the article mention cable impedance causing clipping.

It's convenient for you to post anonymously here and criticise other people's work but I'm done feeding the troll and I suggest you move on as well.
 
Last edited by a moderator:
A

avengineer

Banned
yes you do need to model preamp, cable, and power amp to get a more accurate picture of what is happening. The point here however is cable capacitance is important and too much acts as a LPF.
I wouldn't disagree with that, but I would add that in no case of a normal interconnect length with a normal output impedance and any input impedance would cable capacitance create an LPF anywhere in the audio band, unless the cable has been designed with deliberately and unusually high C.
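To put rough numbers on that claim, here is a small sketch (hypothetical but typical values) inverting f = 1/(2πRC) to see how much cable capacitance it would take to pull the -3 dB corner down to 20kHz:

```python
import math

def capacitance_for_corner(source_ohms: float, corner_hz: float) -> float:
    """Cable capacitance that puts the RC corner at corner_hz for a given
    source (output) impedance. From f = 1/(2*pi*R*C) => C = 1/(2*pi*R*f)."""
    return 1.0 / (2.0 * math.pi * source_ohms * corner_hz)

for r_out in (100, 600, 1000):  # plausible preamp output impedances
    c = capacitance_for_corner(r_out, 20_000)
    # at a typical ~30 pF/ft of interconnect, how many feet is that?
    feet = c / 30e-12
    print(f"{r_out} ohm source: {c * 1e12:.0f} pF (~{feet:.0f} ft of cable)")
```

Even with a fairly high 1k-ohm source impedance the answer comes out near 8,000 pF, on the order of a couple hundred feet of ordinary interconnect, which supports the point that normal-length cables don't roll off the audio band.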
The article doesn't discuss characteristic impedance or TL effects.
My mistake, I mistook the use of the term "cable impedance" to mean "characteristic impedance", when it's meant to mean "cable capacitance". Perhaps if the correct term was used....
It does talk about low-impedance drive issues that some preamps can have a hard time dealing with.
The sentence was: "There is naturally a big difference between rating voltage output on an open circuit, i.e. no load, versus 600 ohms, which is likely to be a considerably tougher task than most amplifiers you’re likely to meet, which have input impedances on the order of tens of thousands of ohms."

My point is, why would you focus on what is certainly a tiny segment of the power amp market that would have a 600 ohm input impedance? Yes, it's a valid point that a typical preamp output won't deal with a 600 ohm load well, but that's a load so far out of the norm, so completely atypical... well, if you don't get it by now, you never will.
Nowhere does the article mention cable impedance causing clipping.
Here's the sentence:
"Rating open circuit doesn’t take into account potential current limits which could bring on preamp clipping much sooner than you might expect once you introduce real world conditions including the cable impedance." (Bold/Italics, mine)

It's badly worded at best, certainly inaccurate and potentially misleading, and at worst, completely wrong. You pick, I'm going with inaccurate and misleading. It would take a simple edit to fix it, if that's not too much trouble.
It's convenient for you to post anonymously here and criticise other people's work but I'm done feeding the troll and I suggest you move on as well.
That's what you've got now? Insults? Pardon me if I now take offense. My intent here is to alert readers of the errors in the article that is presented as authoritative. I'm not randomly sniping and trolling, my objections are specific, clear, and based in fact. I'm sorry if my anonymity bothers you, that's life. I think my post history of trying to help others with useful information speaks for itself. I thought that was the purpose of forums like this.

My anonymity is by choice, after having been stung by identity theft on other forums. With no stated identity there's nothing to steal.

But thanks for the kind words. If trying to point out error that contributes to misunderstanding, or providing more accurate information qualifies me as a "troll", I guess I'd rather be a troll than to stand by and just let the confusion continue unabated.

I'm off to Cafe Press to order my "Troll" T-shirt now.
 
Last edited by a moderator:
G

Grador

Audioholic Field Marshall
The point of the article is to point out things that you should look at and explain why. It's best demonstrated by listing unusually difficult conditions to show the effects. I really don't understand what issue you're having.
 
A

avengineer

Banned
I think my specific objections have been made clear several times, but to simplify, I find the article sensational, alarmist, and in several points just plain wrong. The disaster situations cited are not realistic, and several are not of any concern at all. There was, and still is, an opportunity here to present useful information, but in its present state the article will probably alarm and confuse the neophyte audiophile, something which is already done quite effectively by marketers. It's just not helping.
 
Steve81

Steve81

Audioholics Five-0
You've focused on extreme examples that bring the voltage gain problem near the surface, like a very low output noisy preamp, a low or high gain power amp, and some of the most efficient speakers in the industry.
I guess my question to you is: what exactly do you expect in an article focused on voltage gain and why it (and associated specs) are worth some investigation before making an external amplifier purchase? The article is here to highlight potential problems so that people might avoid them. Worst case examples are used to drill the point home, not suggest that you need anything esoteric/expensive to achieve good performance.

I find the article sensational, alarmist, and in several points just plain wrong.
Oddly enough, I'd say the same thing about your posts on the topic.
 
A

avengineer

Banned
I guess my question to you is: what exactly do you expect in an article focused on voltage gain and why it (and associated specs) are worth some investigation before making an external amplifier purchase? The article is here to highlight potential problems so that people might avoid them. Worst case examples are used to drill the point home, not suggest that you need anything esoteric/expensive to achieve good performance.
You explained the principle just fine. The calculations were dead-on. And the most valuable point of all was that it's possible to pick out an amp/preamp combo that would result in not being able to drive the amp to full output.

A point, which I concede was a surprise to me, was that there are apparently so many preamp outputs in the world that clip at 1Vrms or perhaps a few dB above that, which is just bad design. But if that's the case (which I'm sorry to say I can't confirm from reading the receiver tests posted here), it's worth knowing. You could have stopped right there.

The noise point was an exaggeration, the cable impedance point was inaccurate, the amplifier gain/bandwidth and DC offset points were irrelevant, the preamp load point was inaccurate and cited an unrealistic load condition...all of this and more has been clearly spelled out.

Examples...make them realistic, not extreme. Scale the example to probability...what real world chances are there that my Onkyo AVR can't drive my Emotiva amp? IF an extreme example is necessary, how about a specific case to watch for, like "the BIGGIE3 AVR preamp out and the XTC amp don't work well together because the AVR output clips at 1Vrms and the XTC amp needs 5Vrms for maximum output." But, if most amps are fine with 2Vrms, and most preamp outputs clip at 2.2Vrms, say so. You gave many specific examples of preamp outs with plenty of headroom, no examples of those with clip levels so low as to be a problem.
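For readers wanting to run the matching check themselves, here is a sketch with hypothetical amp numbers. The preamp voltage needed to drive an amp to rated power is V_in = sqrt(P * R) / 10^(gain_dB / 20):

```python
import math

def required_input_vrms(rated_watts: float, load_ohms: float, gain_db: float) -> float:
    """Preamp voltage needed to drive an amp to its rated power:
    V_out = sqrt(P * R); V_in = V_out / 10^(gain_dB / 20)."""
    v_out = math.sqrt(rated_watts * load_ohms)
    return v_out / (10.0 ** (gain_db / 20.0))

# Hypothetical 200 W / 8 ohm amp with a common 29 dB voltage gain:
v_needed = required_input_vrms(200, 8, 29)
print(f"{v_needed:.2f} Vrms needed")  # ~1.42 Vrms

# A preamp that clips at 2.2 Vrms drives it to full power with headroom;
# one that clips at 1 Vrms falls roughly 3 dB short of rated output.
shortfall_db = 20 * math.log10(v_needed / 1.0)
print(f"{shortfall_db:.1f} dB short with a 1 Vrms preamp")
```

With these example numbers, the common 2Vrms/2.2Vrms pairing works fine, and only an unusually low clip point or unusually low amp gain creates a real mismatch.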

And then, what would be REALLY nice is, measure and publish the preamp clip point and output impedance in all reviews. While you're at it, how about along with input sensitivity, measure and publish the input impedance on power amps? The reviews that Gene writes are quite complete (though still missing amp input Z tests), the other reviewers, not so much. Since I was accused of not being familiar with the technology, and told I should read the reviews, I did. Dozens of them, right here. Those figures are missing from all of them, except the ones Gene does, where he states the pre-out clip point. If preamp/amp match is such a huge deal, we need that info tested and published, both out clip and in Z. But what I suspect is, since input impedance of amps is usually found in the spec sheet, and is 50K or higher most of the time, it's viewed as unnecessary to publish in a review. But then, don't cite the one-off 600 ohm input as an example without naming it! (BTW, didn't that device have an input termination switch that would change that from 600 to something much higher?)

Oddly enough, I'd say the same thing about your posts on the topic.
Steve, I would totally expect that, and don't blame you for it. I have questioned everything I've posted, I'm not perfect or immune from error either. I'm satisfied with what I've said, and stand by it, and I'm not apologizing for it. This is NOT personal, I commend you for being active in the forum and writing articles. Articles are a valuable resource, and if done well, make the forum stand out as authoritative. That's what I'm about here. I will apologize if I seem harsh, it's just passion for the science that's been primary in my life for 45 years. I don't really care how I look here, I do care completely that audio concepts be presented accurately, without bias, and without exaggeration so that those new to audio absorb the best information first, and can hopefully filter out the nonsense when they are inevitably hit with it.

I encourage you to edit the article based on what's been discussed here. If you really feel there's absolutely nothing wrong with it, and that is your option, then I'm very glad we had this exchange, and hope the readers have learned something. Don't expect me to ignore further articles, though (unless I'm banned from the forum... sort of waiting for that shoe to drop given who's involved here).
 
gene

gene

Audioholics Master Chief
Administrator
That's what you've got now? Insults? Pardon me if I now take offense. My intent here is to alert readers of the errors in the article that is presented as authoritative. I'm not randomly sniping and trolling, my objections are specific, clear, and based in fact. I'm sorry if my anonymity bothers you, that's life. I think my post history of trying to help others with useful information speaks for itself. I thought that was the purpose of forums like this.

My anonymity is by choice, after having been stung by identity theft on other forums. With no stated identity there's nothing to steal.

But thanks for the kind words. If trying to point out error that contributes to misunderstanding, or providing more accurate information qualifies me as a "troll", I guess I'd rather be a troll than to stand by and just let the confusion continue unabated.
You've "alerted" the readers at least 4-5 times in this thread about your opinions. This has gone beyond that for you now. I disagree with most of your arguments and feel you are using quotes of this article out of context to push your own agenda.

In any event, I updated the article for more clarity as follows:

Rating open circuit doesn’t take into account potential current limits which could bring on preamp clipping much sooner than you might expect once you introduce real world conditions such as esoteric amplifier designs with low input impedances. In addition, some esoteric high capacitance connecting cables can cause premature high frequency roll-off.

and this:


Of course, there is also the matter of the loudspeaker load. This is old hat if you've read the Audioholics article on impedance. As noted prior, adequate voltage output drive from the preamplifier to allow the power amplifier to reach full power is critical. The amplifier still needs a sufficiently stout current stage to deal with the loudspeaker's complex load impedance, lest you run into voltage sag/clipping on the amplifier side. Ideally, of course, an amplifier would act as a voltage source, maintaining output regardless of the load (i.e. it would "double down" into 4 ohms, and "double down" again into 2 ohms). However, few amplifiers are capable of accomplishing this feat at high drive levels.
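The "double down" behavior follows directly from P = V²/R for a constant-voltage source; a quick sketch with a hypothetical 40 Vrms output (a 200 W / 8 ohm rating):

```python
# An ideal voltage-source amplifier holds its output voltage regardless of
# load, so power doubles each time the impedance halves (P = V^2 / R).
# In practice the current demand also doubles each step, which is why
# real amplifiers sag or current-limit before achieving this.
v_out = 40.0  # hypothetical 40 Vrms output
for r in (8, 4, 2):
    print(f"{r} ohm: {v_out ** 2 / r:.0f} W")  # 200, 400, 800 W
```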


Now quit whining and move on.
 
Last edited: