Thank you all for the explanations. I think I'm understanding more now. See, I was thinking that because of Kirchhoff's Current Law, the amps on the line HAD to be equal to the amps consumed. I see now that we are really conserving POWER (wattage), not amperage. In other words (ignoring inefficiencies), in the relationship between wattage, volts, and amps, the wattage stays constant while the voltage and amperage vary. I think I'm on board with that now.
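Just to write that relationship out for myself, here's a minimal sketch in Python, assuming an ideal (lossless) transformer; the function name and numbers are made up purely for illustration:

    # Ideal (lossless) transformer: power is conserved, so V * I is the same
    # on both sides. Numbers are just illustrative.
    def primary_current(secondary_volts, secondary_amps, primary_volts):
        power = secondary_volts * secondary_amps  # watts delivered on the secondary side
        return power / primary_volts              # amps drawn on the primary side

    print(primary_current(12, 10, 120))  # -> 1.0 A at 120 V for a 10 A load at 12 V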
So, let's break it down to something I would actually run into in my tinkering. Say I have a 20 amp breaker that protects a set of 120 volt outlets in my house. Now say I plug in FIVE power supplies that convert 120 V AC to 12 V DC (I get that AC vs. DC doesn't matter here, just describing what I'd actually deal with), each rated for 10 amps at 12 V. Say I leave no safety margin and hook a 10 amp load to each of the five power supplies. I would be pulling 50 amps at 12 V, or 600 watts.

The breaker, being 120 V / 20 amp, should be good for 2.4 kW. Does the breaker trip? If the power-line-to-home voltage situation is analogous to this, then I say no. In the situation I described, the load is 50 amps x 12 volts = 600 watts, so the 120 V breaker "sees" only 5 amps of draw. Those power supplies often have transformers in them, so it must be the same situation as line voltage stepping down to house voltage.
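To sanity-check my arithmetic, here's a rough sketch of the same scenario (assuming 100% efficient supplies; real ones waste some power as heat, so the actual draw at the wall would be a bit higher):

    # Five supplies, each delivering 10 A at 12 V, fed from one 120 V / 20 A breaker.
    # Assumes perfectly efficient supplies, which is not true in practice.
    supplies = 5
    load_amps_12v = 10
    total_watts = supplies * load_amps_12v * 12    # 600 W total at the loads
    draw_at_120v = total_watts / 120               # amps the breaker actually sees
    breaker_watts = 120 * 20                       # 2400 W the breaker can pass

    print(draw_at_120v)                 # -> 5.0 A, well under 20 A
    print(total_watts < breaker_watts)  # -> True, so the breaker should not trip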
Is that correct?
By the way, don't worry, this is just for learning; I am not actually about to max out a PSU.