LED is an exception to this, isn't it?
It's not an exception to the rule, but it could arguably be an exception to the way I worded the rule. I said, "Any circuit will only draw the amps it needs, no more."
Consider a circuit consisting of a car battery and a copper bar. The copper bar is a resistor whose resistance is something like 0.00001 ohms. Ohm's law says 12 V across an incredibly low resistance equals an incredibly high current. And Ohm's law is right; place the copper bar across the battery terminals and the scary noises, flying sparks, intense heat, and possibly flying shrapnel make the proof clear. This circuit, consisting of only a copper bar, demands ("needs," as I put it earlier) quite a lot of current, and the battery does its best to deliver exactly the 1.2 million amps demanded, and not one amp more.
Now replace that copper bar with a 1 Mohm resistor. Ohm's law says that only 12 microamps will flow.
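The two cases above are just I = V/R with the numbers from this answer plugged in. A quick sketch (the helper function is mine, not anything standard):

```python
def current(volts, ohms):
    """Ohm's law: I = V / R."""
    return volts / ohms

# Copper bar across the 12 V battery, roughly 0.00001 ohms:
bar_amps = current(12, 0.00001)
print(bar_amps)  # ~1.2 million amps

# Same battery, 1 Mohm resistor instead:
resistor_amps = current(12, 1_000_000)
print(resistor_amps)  # ~12 microamps
```

Same formula both times; only the resistance changes, and it swings the "demanded" current across eleven orders of magnitude.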
Now replace that resistor with an LED. LEDs don't have a fixed resistance; they have a (roughly) fixed voltage drop. Let's say you have an LED rated 1.4 V / 20 mA. You apply 12 V, current flows, the LED drops 1.4 V, and 10.6 V is left over. That leftover 10.6 V is not limited by any resistance except that of the wire, so the current will be something like 10.6 V / 0.01 ohms = 1,060 A. The circuit "demands" (earlier worded as "needs") over 1,000 A, and that current will flow through the LED for a very brief moment until it burns out. This LED should have a resistor in series with it to limit the current through it.
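Sizing that series resistor is the same arithmetic run backwards: the resistor has to drop the leftover voltage at the LED's rated current. A sketch using the numbers above (the function name is just illustrative):

```python
def led_series_resistor(v_supply, v_forward, i_rated):
    """R = (Vsupply - Vforward) / Irated -- drops the leftover
    voltage at the current the LED is rated for."""
    return (v_supply - v_forward) / i_rated

# 12 V supply, 1.4 V LED drop, 20 mA rated current:
r = led_series_resistor(12.0, 1.4, 0.020)
print(r)  # ~530 ohms
```

In practice you'd round up to the next standard value (560 ohms in the common E12 series), which runs the LED slightly below 20 mA rather than slightly above.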
So the perfect voltage source will always supply however much current the circuit demands, up to infinity (a direct short), whether that is how much current you wanted to flow or not (as in, "ah crap, I didn't want 1000 A flowing through my LED. I forgot the resistor.").