Alright. Here's how it works.
There are two different laws in play here. I'm going to assume a 100% efficient transformer, just to make things easier.
You have a 10 ohm resistor. Put 10 volts across it and you get 1 amp: V = IR, Ohm's law. Put 100 volts across it and you get 10 amps. Yes, as voltage increases, current increases.
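A quick sketch of that arithmetic (the resistor value and voltages are the ones from above):

```python
# Ohm's law: I = V / R
R = 10.0  # ohms, the resistor from the example

for V in (10.0, 100.0):
    I = V / R
    print(f"{V:.0f} V across {R:.0f} ohms -> {I:.0f} A")
# 10 V  -> 1 A
# 100 V -> 10 A
```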
But power in the system has to stay the same. If we have an output power of 1000 W (100 V * 10 A), then we have to have an equal input power. Let's say we have a 1:10 step-up transformer. This gives us a 10 V input and a 100 V output.
For a 10V input to supply 1000W, we need I = 1000W/10V = 100A.
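The power-conservation step, sketched out with the same numbers (assuming the ideal 1:10 transformer from above):

```python
# Ideal transformer: P_in == P_out, so the low-voltage side
# carries proportionally more current.
V_out = 100.0           # volts on the secondary
I_out = 10.0            # amps through the 10 ohm load
P_out = V_out * I_out   # 1000 W

turns_ratio = 10.0      # 1:10 step-up
V_in = V_out / turns_ratio   # 10 V on the primary
I_in = P_out / V_in          # 100 A needed to supply 1000 W

print(f"P_out = {P_out:.0f} W, V_in = {V_in:.0f} V, I_in = {I_in:.0f} A")
# P_out = 1000 W, V_in = 10 V, I_in = 100 A
```

So the primary current is ten times the secondary current: exactly the inverse of the voltage ratio.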
I hate to say it, Painandsuffering, but you're wrong. P on both sides of an (ideal) transformer has to be the same. If it's not an ideal transformer, we need more input power than output power.