* Why does the manufacturer need to reduce the rail voltage more than required to keep the power rating steady?
I believe they did it on an as-required basis. The issue is not so much "power". In my opinion, it was unfortunate that, in the beginning, manufacturers started rating their amps by "power" output instead of by voltage and current. It is too late to change that now.
Loudspeakers are, in effect, voltage-driven devices. Their sensitivities are often rated as X dB/1W/1m, but would more appropriately be rated as X dB/2.83V/1m; 2.83 V is picked because, into the popular 8 ohm load, it delivers 1 W, so the two ratings coincide. The fact is, if you apply a voltage signal to the loudspeaker's terminals, it makes sound in proportion to the applied voltage, not to the "power input" as such. That is, the speaker may consume only 0.1 W at one moment yet make a very loud sound, and consume 0.2 W at another moment while making a much quieter sound.
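To put a number on that, here is a minimal sketch (Python, just plugging the figures above into P = V²/R): 2.83 V into 8 ohms is about 1 W, while the same 2.83 V into 4 ohms is about 2 W, which is why the 2.83 V spec is really a voltage rating rather than a power rating.

```python
# Quick check of the 2.83 V sensitivity convention: P = V^2 / R.
def power_w(volts_rms: float, ohms: float) -> float:
    """Average power delivered to a purely resistive load."""
    return volts_rms ** 2 / ohms

print(power_w(2.83, 8))  # ~1.0 W -> X dB/2.83V/1m equals X dB/1W/1m for an 8 ohm load
print(power_w(2.83, 4))  # ~2.0 W -> same voltage, double the power into 4 ohms
```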
If an amp is rated 100 W into an 8 ohm resistor, then it can safely be rated 50 W into a 4 ohm resistor, and that means the output voltage is reduced by half so that the current stays the same. That current would be about 4.8 A (using your 38.47 V, 8 ohm example).
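Here is a quick numeric sketch of that paragraph (Python; I'm treating your 38.47 V as the peak output voltage set by the rails, so the continuous sine-wave power into a resistor is V_peak²/2R):

```python
# Halve the load impedance and halve the output voltage: the peak current
# stays the same and the continuous sine-wave power rating halves.
v_peak_8, r8 = 38.47, 8.0
i_peak_8 = v_peak_8 / r8             # ~4.81 A peak
p_8 = v_peak_8 ** 2 / (2 * r8)       # ~92.5 W, roughly the "100 W into 8 ohm" rating

v_peak_4, r4 = v_peak_8 / 2, 4.0     # output voltage halved for the 4 ohm load
i_peak_4 = v_peak_4 / r4             # ~4.81 A peak -- unchanged
p_4 = v_peak_4 ** 2 / (2 * r4)       # ~46 W, roughly the "50 W into 4 ohm" rating

print(i_peak_8, i_peak_4, p_8, p_4)
```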
Now, if the manufacturer knows their amp can deliver more than 4.8 A, they don't have to lower the rail voltage that much for the 4 ohm setting; maybe lowering it by 30% is enough, just as an example. That's why the lowered rail voltage varies depending on the specific amp's rated voltage and current capability. So far so good? You can also see why I said amps would more appropriately be rated by their voltage and current limits than by "power", which by itself tells you very little. By the way, keep in mind that you don't really know how much "power" your speaker actually consumes; all you know is how much current it draws on a moment-by-moment basis, and a good portion of the so-called "power" is dissipated in the amp itself!
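And a hypothetical comparison of how far the rail might need to drop. The 50% and 30% figures are just the examples above, and the current-capability remarks in the comments are made up purely for illustration:

```python
# Hypothetical: how much peak current does a 4 ohm load demand for a given
# rail reduction? The manufacturer picks the reduction so that the demand
# stays within the output stage's current capability.
rail_8ohm = 38.47                              # rail/peak voltage used for 8 ohms
for drop in (0.50, 0.30, 0.0):                 # 50%, 30%, or no reduction
    rail_4ohm = rail_8ohm * (1 - drop)
    i_demand = rail_4ohm / 4.0                 # peak current into 4 ohms
    print(f"drop {drop:.0%}: rail {rail_4ohm:.1f} V -> {i_demand:.1f} A peak")

# drop 50%: ~4.8 A (same current as the 8 ohm case)
# drop 30%: ~6.7 A (fine only if the output stage can actually deliver it)
# drop  0%: ~9.6 A (needs double the 8 ohm current capability)
```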
* Would dropping the rail voltage from 38.47V (in the following example) to 33.3V significantly increase power loss in the form of heat? If so, does this relate to the transformer's efficiency at different voltages?
No. Dropping the rail voltage results in less current and therefore less loss, not more, all else being equal.
Copper loss = I²R for a resistive load. For an inductive or otherwise reactive load, such as many loudspeakers, it gets more complicated, as much of the loss is dissipated in the amp rather than in the speaker; it depends on the speaker's phase angle vs. frequency characteristics.
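As a rough illustration of the "lower rail, less heat" point, here is a back-of-the-envelope sketch using the textbook worst-case dissipation of an idealized class-B output stage into a purely resistive load (2·Vrail²/π²R). A real amp with class-AB bias and a reactive speaker load will behave differently, which is exactly what the article below is about:

```python
import math

# Idealized class-B output stage driving a purely resistive load: the textbook
# worst-case heat dissipated in the output devices is 2 * Vrail^2 / (pi^2 * R).
# Reactive speaker loads shift more of the dissipation into the amp.
def worst_case_amp_dissipation_w(v_rail: float, r_load: float) -> float:
    return 2 * v_rail ** 2 / (math.pi ** 2 * r_load)

for v_rail in (38.47, 33.3):
    watts = worst_case_amp_dissipation_w(v_rail, 4.0)
    print(f"{v_rail} V rails into 4 ohms: ~{watts:.0f} W worst-case amp dissipation")
# ~75 W at 38.47 V rails vs ~56 W at 33.3 V rails -- the lower rail runs cooler.
```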
Here's a good article for you:
Phase Angle Vs. Transistor Dissipation (sound-au.com)
* If more heat is produced when using lower impedance speakers at the same wattage, would it not make more sense to keep the heat production steady (i.e. as if an 8 ohm speaker were being driven) by adjusting the voltage accordingly? Why would the manufacturer drop the rail voltage more than that?
Yes, but you are assuming the manufacturers drop the voltage more than necessary, and you don't really know that for sure, since you don't know their product's current capability. Again, think current and phase angle, not "power".
* I have not fully understood the certification process, hence my next question: is the certification process biased towards the use of higher impedance speakers, to the extent that it forces manufacturers to degrade the amplifier's actual capabilities (in practice leading to less heat being produced with lower impedance speakers than the other way around)?
I'm not sure I fully understand what you are asking, but this is a complicated issue, and I don't think there is a right or wrong answer even if you clarify your question.