
Why do they list amperage on an AC adaptor?


jac4b

New Member
Sorry, trying to understand the basics. So let's say a wall charger says "OUTPUT: 3V 500mA"...why is it telling me the amperage? Isn't the amperage based on whatever the device draws? Or does that mean 500mA max? If I use a different charger that says it outputs 3V and 800mA, would it fry the device or would it just pull 500mA?
Thanks!
 
The current rating is the maximum you can draw. A device that uses, say, 500mA will be fine on a transformer that can deliver at least that amount, so using an 800mA transformer will be fine... unless you are talking about LEDs, because they need to be limited in the amount of current you give them.
 
Sorry, trying to understand the basics. So let's say a wall charger says "OUTPUT: 3V 500mA"...why is it telling me the amperage? Isn't the amperage based on whatever the device draws? Or does that mean 500mA max? If I use a different charger that says it outputs 3V and 800mA, would it fry the device or would it just pull 500mA?
Thanks!

The device, any device, will try to draw its rated current. Let's forget the charger analogy for a moment. If I have a supply rated at 3 volts @ 500 mA and I try to light a bulb rated at 3 V @ 1.0 Amp, what will happen? I will overload the supply, as the load is demanding more than my supply can deliver. Now let's say I have the same supply but a bulb rated at 3 V @ 250 mA. My load now requires only half of what my supply can provide. The current draw or power used is a function of the load and not the supply, as long as the supply is rated to provide it. Even though my supply can provide 3 V @ 500 mA, if my load only requires 250 mA, that is all it will use or take.
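To make that arithmetic concrete, here is a small Python sketch of the two bulbs, treating each one as a simple fixed resistance (a simplification; a real filament's resistance rises as it warms up):

```python
# The load, not the supply, sets the current draw.
# Each bulb is modeled as a fixed resistance, a deliberate simplification.

SUPPLY_VOLTS = 3.0
SUPPLY_MAX_AMPS = 0.5   # the supply's 500 mA rating

def check_load(rated_volts, rated_amps):
    """Estimate the current a resistive load draws from the 3 V supply."""
    resistance = rated_volts / rated_amps      # R = V / I at the bulb's rating
    drawn = SUPPLY_VOLTS / resistance          # I = V / R on the 3 V supply
    status = "OK" if drawn <= SUPPLY_MAX_AMPS else "OVERLOADED"
    print(f"{rated_volts:.0f} V @ {rated_amps * 1000:.0f} mA bulb draws "
          f"{drawn * 1000:.0f} mA -> supply {status}")

check_load(3.0, 1.0)    # 1 A bulb: wants 1000 mA, more than the supply can give
check_load(3.0, 0.25)   # 250 mA bulb: draws 250 mA, half the supply's rating
```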

When I plug my cell phone into a charger, the phone will only draw what it needs and no more, regardless of what the charger is rated to deliver, as long as the phone does not try to draw more than the charger is designed to deliver.

Does that make sense?

Ron
 
The second reason for listing the current capability of a power supply (plug pack, in Australia) is this:
Suppose it listed only the voltage: 12V PLUG PACK.
You would not know if the plug pack could supply 300mA or 1 amp.
If you connected a load that demanded 800mA to a 300mA plug pack, it would get extremely hot and possibly be damaged. In other words, it is the CURRENT (in AMPS) that determines how hot a plug pack will get, and this is a MOST CRITICAL FACTOR.
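As a rough illustration of why the current rating is what matters for heat: most of the heat comes from current flowing through the plug pack's own internal resistance (windings, regulator), and that loss grows with the square of the current. The 2 ohm figure below is invented purely for illustration:

```python
# Internal loss in the plug pack is roughly I^2 * R_internal.
# The 2-ohm internal resistance is a made-up illustrative value.

R_INTERNAL = 2.0  # ohms (hypothetical)

for load_ma in (100, 300, 800):
    amps = load_ma / 1000.0
    heat_watts = amps ** 2 * R_INTERNAL      # P = I^2 * R
    print(f"{load_ma:3d} mA load -> about {heat_watts * 1000:.0f} mW of internal heat")

# 100 mA -> ~20 mW, 300 mA -> ~180 mW, 800 mA -> ~1280 mW:
# heat rises with the square of the current, not with the voltage.
```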
 
And the hot plug pack in question would be supplying 800 or 300mA? 300, right, because that's its max, which would force the device to function at 300mA? And it gets hot because it's dissipating current through internal resistors as heat?
 
The current draw or power used is a function of the load and not the supply as long as the supply is rated to provide it.

Gotcha, so it's not like the charger "pushes" extra current, but this is only when the load contains sufficient circuitry to prevent overload, right? Otherwise, why is it a different scenario for applying too much current to an LED? How can an LED be rated for 3V, but then when you apply 3V it's too much? I guess I'm thinking of emf as the only variable.

To put it another way, if I'm working on a circuit with 3V, is there a scenario where I can get hurt from 3V? For there to be a high current in such a circuit, there would have to be a tiny resistance, which would change the second I touched it (or put myself in series with it).
 

There you go. If I have a 5 Volt charger rated to deliver 1,000 amps, it doesn't matter, because when I connect my cell phone it will only draw a few hundred mA.

Now the LED is actually not too different. If I have a 3 Volt LED that has a forward current of 20 mA and I apply a well regulated 3 volts, the LED will draw its rated 20 mA current. However, that is seldom the case. Generally we want to power low voltage LEDs from voltages like 5 or 12 volts. So let's say I remove the little incandescent lamp from a two D cell flashlight. That little bulb is a 3 volt bulb designed to work and glow nicely at 3 volts. So what happens if I place 12 volts across that bulb? It will glow real, real bright for a real short time and poof, it's toast. Because of the much higher than rated voltage, the poor bulb will draw much more than its rated current. Actually about four times as much. The voltage is the problem, causing a much higher current draw.
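A quick check of the "about four times as much" figure, treating the bulb as a fixed resistance (a real filament's resistance rises as it heats, so the true number is somewhat lower, but the bulb is still toast). The 300 mA rated current below is just an assumed example value:

```python
# Over-voltage on a small bulb, modeled as a fixed resistance.

BULB_VOLTS = 3.0
BULB_AMPS = 0.3                         # assumed rated current, for illustration only

resistance = BULB_VOLTS / BULB_AMPS     # R = V / I = 10 ohms
amps_at_12v = 12.0 / resistance         # I = V / R = 1.2 A
print(f"Rated current: {BULB_AMPS:.1f} A, current at 12 V: {amps_at_12v:.1f} A")
print(f"That is {amps_at_12v / BULB_AMPS:.0f}x the rated current.")
```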

The LED is no different. If I want to power a 3 volt LED rated at 20 mA from a 5 volt source, I need to limit the current. I need to get rid of a few volts: I want 3 and have 5, so 5 - 3 = 2. I know I want my current limited to 20 mA and I know I want to drop 2 volts, so I can use Ohm's Law and do some simple math. 2 Volts / 0.020 Amp = 100 Ohms. Thus I want to place 100 Ohms in series with my LED. The LED will see 3 volts across it and the 100 Ohm resistor will see 2 volts across it, and 3 + 2 = 5, which matches my 5 volt source. It doesn't matter if my 5 volt source can deliver 1,000 amps, as my LED will only draw 20 mA in my circuit.
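The same resistor arithmetic, written out as a tiny script:

```python
# Series resistor for a 3 V, 20 mA LED on a 5 V supply (Ohm's Law).

SUPPLY_VOLTS = 5.0
LED_VOLTS = 3.0       # forward voltage the LED should see
LED_AMPS = 0.020      # desired forward current, 20 mA

volts_to_drop = SUPPLY_VOLTS - LED_VOLTS       # 5 - 3 = 2 V across the resistor
resistor_ohms = volts_to_drop / LED_AMPS       # R = V / I = 100 ohms

print(f"Series resistor: {resistor_ohms:.0f} ohms, dropping {volts_to_drop:.0f} V "
      f"at {LED_AMPS * 1000:.0f} mA")
```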

Ron
 
And the hot plug pack in question would be supplying 800 or 300mA? 300, right, because that's its max
If you put a device that wants to take, say, 800mA on a 300mA plug pack, the device will take more than 300mA but less than 800mA, because the voltage delivered by the plug pack will drop a certain amount and this will limit the current.
But the device will certainly take more than 300mA and the plug pack will get very hot. Exactly how much current flows, and how hot it will get, is unknown.
But the fact is this: the 300mA rating is the maximum that SHOULD BE taken from the plug pack, as this amount of current will heat the plug pack up to its MAXIMUM safe temperature.
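To see where a number between 300mA and 800mA could come from, here is a toy model: an ideal 12 V source behind an internal resistance, feeding a resistive load sized to draw 800mA at the full 12 V. The 3 ohm internal resistance is invented for illustration; a real plug pack behaves in a messier way, but the voltage sag and the extra internal heat show up the same way:

```python
# Toy model of an overloaded plug pack: ideal source + internal resistance.
# The 3-ohm internal resistance is a made-up illustrative value.

SOURCE_VOLTS = 12.0
R_INTERNAL = 3.0                         # ohms (hypothetical)
R_LOAD = SOURCE_VOLTS / 0.8              # load that would draw 800 mA at 12 V (15 ohms)

current = SOURCE_VOLTS / (R_INTERNAL + R_LOAD)   # series circuit: I = V / (Ri + RL)
load_volts = current * R_LOAD                    # voltage actually reaching the device
pack_heat = current ** 2 * R_INTERNAL            # power wasted as heat inside the pack

print(f"Current: {current * 1000:.0f} mA (above the 300 mA rating, below the 800 mA demand)")
print(f"Voltage at the device sags to {load_volts:.1f} V")
print(f"Roughly {pack_heat:.1f} W of heat inside the plug pack")
```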
 