I'm looking at a few different receivers, mainly Denon and Yamaha, and I noticed that (especially with Yamaha) they like to spec different models of their amplifiers with 10 W differences. Now, 10 W around 100 W is negligible, but it brings up my question.
How do they come up with designs that have 10 W output differences? The size of the MOSFETs an amplifier uses usually dictates how much current, and therefore power, the amplifier is capable of. Obviously the power supply needs to be just as capable. I can't imagine they are using three different MOSFETs for three different receiver models that each give a 10 W difference. I'm guessing they use the same FETs for all or most of their models and either intentionally limit the maximum current at different values for different models, or just use a smaller power supply to save cost and spec the amp a little lower, even though the power stage is identical to that of the larger models.
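For a rough sense of scale, here's a back-of-the-envelope sketch (my own illustration, assuming an ideal lossless amplifier driving a continuous sine into an 8-ohm resistive load) showing what supply rails each power rating would need, using P = V_rms^2 / R:

import math

def peak_rail_for_power(p_watts: float, load_ohms: float = 8.0) -> float:
    """Ideal peak rail voltage needed for p_watts of continuous
    sine power into load_ohms, ignoring all losses."""
    v_rms = math.sqrt(p_watts * load_ohms)   # from P = V_rms^2 / R
    return v_rms * math.sqrt(2)              # peak = rms * sqrt(2)

for p in (100, 110, 120):
    print(f"{p} W into 8 ohms -> ~{peak_rail_for_power(p):.1f} V peak rails")

That prints roughly 40.0 V, 42.0 V, and 43.8 V, so each 10 W step only needs about 2 V more on the rails. If that's anywhere near right, a slightly different transformer tap with the exact same output stage would be enough to account for these spec differences.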
I do realize some models really do use larger power supplies and FETs, but I am mainly referring to models that are very similar to each other, for example the Yamaha RX-V550, RX-V650, and RX-V750.
Any thoughts?