The closest I ever got to understanding electrical matters was when my ancient high school science teacher compared electricity to water flowing through a garden hose to explain voltage, amperage, etc. So pardon my continued ignorance.
I have recently been told that with modern a/v amps it's high CURRENT that makes for the good ones... watts are relegated to secondary importance. Was I being hustled, or is this true? If it is true, why is wattage the principal mainstay of specs on receivers/amps?
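What confuses me is that, if I remember my high school formulas right, watts and amps are tied together through the speaker load, so I don't see how one can matter without the other. Here's my back-of-the-envelope math (using a hypothetical 4-ohm speaker just as an example):

  power = voltage x current  (P = V x I)
  power = current^2 x resistance  (P = I^2 x R)

  So into a 4-ohm speaker, delivering 100 watts means:
  I = sqrt(P / R) = sqrt(100 / 4) = 5 amps

If that's right, then a "high current" amp and a "high watt" amp into the same speaker sound like the same thing to me... unless the current claim is really about how an amp behaves when speaker impedance dips. Am I missing something?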