The first thing we need to do is clarify the difference between current (amps) and voltage (volts).
I'll use the water analogy (some people will groan at me :roll: ). Voltage can be compared to water PRESSURE (psi or kPa); current can be compared to water FLOW (litres/min).

Consider a closed tap. You have a pressure difference (voltage) from the pipe side of the tap to the tap opening, but there is no flow, because the "RESISTANCE" of the tap is very high (in fact, infinite). When you open the tap a bit, you lower the resistance, which allows water to flow (current). Provided the pressure stays the same, the more you open the tap, the more water flows. Likewise, if there is only a little pressure, only a small amount of water flows, but increasing the pressure also increases the flow.
Electrically, this is described by Ohm's Law:
V = IR
where V is the voltage across a component (or group of components),
I is the current through it, and
R is its resistance.
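As a quick sanity check, Ohm's Law can be worked through in a few lines of Python (the values here are made up purely for illustration):

```python
# Ohm's Law: V = I * R, so I = V / R
V = 9.0    # volts across the component (example value)
R = 450.0  # resistance in ohms (example value)

I = V / R  # current through the component, in amps
print(f"I = {I:.3f} A ({I * 1000:.0f} mA)")
```

Rearranging the same equation also gives R = V / I and V = I * R, whichever quantity you happen to know.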
One of the common misunderstandings people have is that power supplies supply a set voltage AND current. NOT TRUE. A supply will (generally) provide a constant voltage, but will only supply as much current as you allow it to. The value of this current depends on the RESISTANCE of the load: the higher the resistance, the less current flows.
(There are also constant current supplies, which do the opposite, but that is another issue altogether.)
"voltage drops through resistors"
It is better to talk about voltage ACROSS components, not THROUGH them. Conversely, we usually talk about current THROUGH components, not across them.
I have a multimeter, and when I was testing the amperage on the power supply from above, it gave a reading of 0.8 on the 10 amp setting. Would this be 800 milliamps or 8 amps?
How were you testing it? It is extremely bad practice to put an ammeter directly across a voltage source: an ammeter has a very low internal resistance, so connecting it across a supply is effectively a short circuit. Ammeters generally go in series with the load. On the 10A setting, the display reads directly in amps, so 0.8 indicates 0.8A, which is 800mA.
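The conversion in that last answer is just moving between amps and milliamps; a one-liner makes it explicit:

```python
# On a meter's 10 A range, the display reads directly in amps.
reading_amps = 0.8                  # displayed value on the 10 A setting
milliamps = reading_amps * 1000     # 1 A = 1000 mA
print(f"{reading_amps} A = {milliamps:.0f} mA")
```

So a display of 0.8 on the 10 A range is well under 1 A, nowhere near 8 A.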