Not quite right. Devices need a certain voltage to work properly; it's the voltage that "drives" the current through the device. You cannot hook a 12 V device up to a 5 V source and expect it to automatically draw more power.
R = V ÷ I
V = I × R
I = V ÷ R
This relationship is what you need to learn. It's all the same equation, just looked at from three different points of view.
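To make that concrete, here is a tiny sketch that works the same relationship through with made-up numbers (a 12 V supply across an assumed 12 Ω load, values chosen purely for illustration):

```python
# Ohm's law worked with assumed example values: 12 V across a 12 ohm load.
V = 12.0   # volts (assumed)
R = 12.0   # ohms (assumed)

I = V / R                      # I = V / R  -> 1.0 A
print(f"I = {I:.2f} A")        # knowing any two values gives you the third:
print(f"V = {I * R:.2f} V")    # V = I x R  -> 12.00 V
print(f"R = {V / I:.2f} ohm")  # R = V / I  -> 12.00 ohm
```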
Voltage is like a pressure that pushes current through resistance. For a specific resistance, increasing the voltage will increase the current, and since power is P = V × I, the power will increase as well. Reducing the voltage will reduce the current, and the power will decrease.
So although your calculations are correct, they don't show the proper relationship between voltage and current, because you have left out resistance. Since the resistance of a circuit (at a macro level) doesn't change, reducing the input voltage will decrease the power, and the device will not work.
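For instance, here is a rough sketch of your 12 V-device-on-a-5 V-supply situation, assuming a fixed resistance of 12 Ω (again, a made-up value just for illustration):

```python
# Fixed resistance fed from two different supply voltages.
# The 12 ohm resistance is an assumed value, not from the question.
R = 12.0  # ohms (assumed)

for V in (12.0, 5.0):
    I = V / R    # Ohm's law: the current this voltage can push through R
    P = V * I    # power delivered to the device, P = V x I
    print(f"{V:>4.1f} V -> {I:.2f} A, {P:.2f} W")

# Output:
# 12.0 V -> 1.00 A, 12.00 W
#  5.0 V -> 0.42 A, 2.08 W
```

At 5 V the same device only gets about a sixth of the power it was designed for, which is why it simply won't run rather than somehow drawing more.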
Did any of that make sense?