The same goes in reverse: if you step down the voltage and keep the power constant, then the current will be increased. I'm not saying how healthy this would be for the circuit - not very, but that's the point.
P = VI, assuming power draw is constant. Voltage goes down. Guess what has to go up.
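Just to put rough numbers on that, here's a minimal sketch of the constant-power case (the figures are arbitrary, not taken from any real device):

```python
# Constant-power case: P = V * I, so current rises as voltage falls.
# The 120 W figure and the voltages are made up for illustration.
P = 120.0              # watts, assumed constant power draw
for V in (120.0, 100.0, 80.0):
    I = P / V          # current must go up as voltage goes down
    print(f"V = {V:5.1f} V -> I = {I:.2f} A")
```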
Are you saying Ohm's Law is all wrong?
No, I'm not. Your equation assumes constant power. If you increase the voltage on the primary, the voltage will go up on the secondary; it has to. If the load resistance is constant but the voltage goes up, then the current has to go up. Power in the load will go up as the square of the voltage divided by the resistance.
So say we have a power transformer with a primary voltage of 100 volts, and it is a 2:1 transformer; then the voltage on the secondary will be 50 volts. If the resistance across the secondary is 50 ohms, then the current will be 1 amp. Now let's keep the resistance constant at 50 ohms and increase the primary voltage to 120 volts, so the secondary voltage will be 60 volts. The current is now 60 divided by 50, which is 1.2 amps.
Now the power dissipated in the load in the first example is the square of the current X the resistance. That is 1 squared X 50, which is 50 watts. After the increase to 60 volts it is 1.2 squared X 50, which is 1.44 X 50, or 72 watts.
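If it helps, here's a small sketch that just re-runs the arithmetic of that 2:1 transformer example, with the load held at 50 ohms:

```python
# 2:1 transformer with a fixed 50 ohm load, before and after the
# primary voltage rises from 100 V to 120 V (values from the example).
turns_ratio = 2.0
R_load = 50.0                       # ohms, held constant

for V_primary in (100.0, 120.0):
    V_secondary = V_primary / turns_ratio
    I = V_secondary / R_load        # Ohm's law on the secondary
    P = I**2 * R_load               # same number as V_secondary**2 / R_load
    print(f"{V_primary:.0f} V in -> {V_secondary:.0f} V out, "
          f"{I:.2f} A, {P:.0f} W")
```

That prints 50 W for the 100 V case and 72 W for the 120 V case, matching the figures above.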
Your equation only holds true if the load resistance is increased to keep the power constant. However, that would not be the case for the load offered by an electronic device. The load resistance would stay, to all intents and purposes, constant until the heating effect of the increased power raised the load resistance slightly.
The real take-home is that if the power grid voltage rises, then the heat goes up by the increase in V squared/R, which gives the same result as the increase in current squared X the resistance. Those are the power dynamics of an increase in voltage with constant resistance.
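Put another way (a rough sketch, reusing the 50 ohm load from the example above): a 20% rise in voltage at constant resistance means 1.2 squared, or 1.44 times the power, whichever way you compute it:

```python
# Same take-home in numbers: with R fixed, power scales with the
# square of the voltage, and V**2 / R agrees with I**2 * R.
R = 50.0                        # ohms, constant load (as in the example)
V_old, V_new = 50.0, 60.0       # secondary voltage before and after
for V in (V_old, V_new):
    I = V / R
    print(f"V = {V:.0f} V: V^2/R = {V**2 / R:.0f} W, I^2*R = {I**2 * R:.0f} W")
print(f"power ratio = {(V_new / V_old) ** 2:.2f}")   # 1.44, i.e. 44% more heat
```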