Amplifier classes describe how the output transistors are biased - in effect, for how much of the signal cycle they conduct. In the past, inefficient classes like A or even A/B were necessary to keep distortion at inaudible levels. Other classes existed but were not normally used for high fidelity sound reproduction; they were common in RF amplifiers, for example.
In recent years, technological improvements have brought distortion under control even in the very efficient classes of amplification. As an example, my Pioneer receiver uses class D amps, which are very efficient, giving a really cool-running yet powerful output stage. But the distortion is still inaudible - just as low as if it were an A/B amp. Class D amps have actually been used in high fidelity sound applications for quite a while and are being used more and more.
Class G and H amps take a different route to efficiency: instead of switching the output fully on and off, they modulate the supply rail voltage so it tracks the input signal as it rises and falls (class H makes the rail nearly "infinitely variable," while class G steps between fixed rails). The D amps, by contrast, are on-and-off devices - they are sometimes referred to as "switching amps." The class A amps, at the other extreme, run flat out all the time. You can warm a room with a big one.
The more efficient the amp, the less power it consumes, the less heat it generates, and the smaller the power supply it requires. The problem has historically been that distortion rises with efficiency and other characteristics important for sound reproduction get worse. With modern technology that is becoming less and less of an issue. I think amps will continue to become more efficient going forward, and I view that as a positive thing.
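The power-and-heat tradeoff is simple arithmetic. Below is a back-of-envelope sketch using rough best-case efficiency figures often quoted for each class (these percentages are illustrative assumptions - real amps vary widely, especially at typical listening levels well below full power):

```python
# Rough, assumed best-case efficiencies per class - not measured values.
def supply_and_heat(output_watts, efficiency):
    """Power drawn from the supply and power wasted as heat."""
    drawn = output_watts / efficiency
    return drawn, drawn - output_watts

for cls, eff in [("A", 0.25), ("AB", 0.65), ("D", 0.90)]:
    drawn, heat = supply_and_heat(100, eff)
    print(f"Class {cls}: draws {drawn:.0f} W, dissipates {heat:.0f} W as heat")
```

At 100 W out, a class A stage on these numbers draws 400 W and dumps 300 W as heat, while a class D stage draws about 111 W and dumps about 11 W - which is why the class D amp runs cool and can use a much smaller power supply and heatsink.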