Hello there,
Yes, it is true that the CPU draws more current than the 12v line delivers, because the motherboard has buck converters on it that trade voltage for current: the output carries more amps than the input, at a proportionally lower voltage. A 12v line rated for 10 amps can deliver almost 40 amps at 3v through a good buck converter. In other words it acts almost like a transformer, because an ideal buck converter conserves power: 12v at 10 amps is 120 watts, and 120 watts divided by 3v is 40 amps, so you can see how this works. Since the CPU core runs at about 1.4v, if we round that up to 1.5v and divide the same 120 watts by 1.5v we get 80 amps. That is very similar to how a transformer works (with AC of course), except the buck converter does it with DC.
Similarly, 12v at 5 amps is 60 watts, and 60 watts divided by 1.5v gives 40 amps.
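To make that arithmetic concrete, here is a small Python sketch of the power-conservation math. The `efficiency` parameter is my own addition (not from the discussion above) to show that a real converter delivers somewhat less than the ideal figure:

```python
def buck_output_current(v_in, i_in, v_out, efficiency=1.0):
    """Output current of a buck converter, assuming power is conserved.

    An ideal converter (efficiency=1.0) passes all input power to the
    output; real parts are typically around 0.85-0.95 efficient.
    """
    p_in = v_in * i_in                   # input power in watts
    return p_in * efficiency / v_out     # P / V_out = I_out

# 12 V at 10 A stepped down to 3 V: 120 W / 3 V
print(buck_output_current(12, 10, 3))    # 40.0 A ideal
# The same 120 W at a 1.5 V CPU rail: 120 W / 1.5 V
print(buck_output_current(12, 10, 1.5))  # 80.0 A ideal
# 12 V at 5 A into 1.5 V: 60 W / 1.5 V
print(buck_output_current(12, 5, 1.5))   # 40.0 A ideal
```

With a realistic 90% efficiency the 3v case comes out to 36 amps instead of 40, which is why the "almost" above matters.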
Another interesting question related to this is how we can get 40 amps into the CPU through such small pins on the CPU package. If you look at a drawing of a CPU pinout you'll see that many pins are assigned to the power supplies, so the current can be divided up among them.
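As a rough sketch of why the pin count matters; the pin count below is a made-up round number for illustration, not taken from any real CPU datasheet:

```python
def current_per_pin(total_current, power_pin_count):
    """Current each power pin carries if the load is shared evenly."""
    return total_current / power_pin_count

# If 80 A is spread across, say, 100 power pins (hypothetical count),
# each pin only carries 0.8 A, which a small contact handles easily.
print(current_per_pin(80, 100))  # 0.8
```

Real packages dedicate hundreds of pins or pads to the core supply and ground returns, so the per-contact current stays small.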
The other nice thing about doing it this way is that the cable from the power supply only has to carry 5 to 10 amps to the motherboard rather than 40 to 80 amps, which would cause a much greater voltage drop in the wires and connectors. The motherboard's buck converters can feed the CPU through much shorter trace runs and don't need any connectors other than the CPU socket.
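To see why carrying 10 amps instead of 80 amps through the cable matters, here is a sketch using Ohm's law (V = I·R) and the I²R heating formula. The 25 milliohm cable-plus-connector resistance is an assumed round number, not a measured value:

```python
def drop_and_loss(current, resistance):
    """Voltage drop across, and power dissipated in, a cable."""
    v_drop = current * resistance        # Ohm's law: V = I * R
    p_loss = current ** 2 * resistance   # I^2 * R heating in the cable
    return v_drop, p_loss

R = 0.025  # assumed 25 milliohm round-trip cable/connector resistance

# 10 A on the 12 V side: a 0.25 V drop and 2.5 W lost in the cable
print(drop_and_loss(10, R))   # (0.25, 2.5)
# 80 A, if the 1.5 V rail had to run through the same cable:
# a 2 V drop, which is more than the rail voltage itself, and 160 W lost
print(drop_and_loss(80, R))   # (2.0, 160.0)
```

The 80-amp case would drop more volts in the cable than the CPU rail even has, which is exactly why the conversion happens right next to the socket.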
CPUs were using more and more power as time went on, and that was creating a power density problem, so manufacturers started going to multi-core designs rather than trying to get the ever-higher heat losses out of a single package somehow.