Question about current on an electromagnet

Status
Not open for further replies.

Redmond

New Member
Over the past while I have been working on constructing a functional electromagnet, but so far I've been playing with small voltages and currents (about 12 V, 1.5 A). I recently purchased a Variac that outputs up to 100 V and 1.5 A, thinking this would be a nice addition to my project (despite the low amperage). However, I was told that if I were to connect my electromagnet to the Variac, it would draw too much current and blow the power supply.

Now my question is: how can a load draw more current than the supply is delivering?
For instance, wouldn't my power supply only be able to deliver up to 100 V and 1.5 A? I am fairly new to the theory of electricity, so maybe I'm just looking at it the wrong way, but any help or clarification would be very much appreciated.

Thanks in advance.
 
A Variac delivers a variable AC voltage. The smaller the wire, the less current it takes to melt it. A fuse is basically a wire that melts within an enclosure (a glass envelope) so it doesn't cause a fire.
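The wire-melting point comes down to Joule heating: power dissipated in a resistive wire goes as the square of the current. A short Python sketch (the 0.5 Ω fuse-element resistance is a made-up illustrative value, not from any datasheet):

```python
# Sketch: Joule heating in a resistive wire, P = I^2 * R.
# The resistance value below is hypothetical, chosen only to show the scaling.

def joule_heating_watts(current_a: float, resistance_ohm: float) -> float:
    """Power dissipated in a resistive element: P = I^2 * R."""
    return current_a ** 2 * resistance_ohm

# Tenfold current means a hundredfold heating -- why thin wire fails fast:
print(joule_heating_watts(1.0, 0.5))   # 0.5 W
print(joule_heating_watts(10.0, 0.5))  # 50.0 W
```

The square law is the key point: a modest overcurrent produces a disproportionately large amount of heat in the element.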

Direct current (DC) is better suited for a magnet. The amount of magnetism is related to the current through the wire, not the voltage. As the wire heats up, its resistance rises, so at a fixed voltage the current would drop. The large commercial magnets I worked with had a magnetic field of 30 kilogauss and were water cooled.
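The "current, not voltage" point can be made concrete with the standard long-solenoid approximation, B = μ₀·(N/L)·I. The turn count and length below are hypothetical example values:

```python
import math

MU_0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A

def solenoid_field_tesla(turns: int, length_m: float, current_a: float) -> float:
    """Approximate field inside a long air-core solenoid: B = mu_0 * (N/L) * I."""
    return MU_0 * (turns / length_m) * current_a

# Doubling the current doubles the field; voltage never appears in the formula.
b1 = solenoid_field_tesla(turns=500, length_m=0.1, current_a=1.5)
b2 = solenoid_field_tesla(turns=500, length_m=0.1, current_a=3.0)
print(b1, b2)
```

Voltage only matters indirectly, as whatever is needed to push the desired current through the winding's resistance.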

Because the magnet is wound with low-resistance wire, the voltage required is low. The magnet I used required about 100 A of current at a low voltage, under 10 V or so.
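That 100 A / under-10 V combination is just Ohm's law. A minimal sketch, assuming a hypothetical 0.05 Ω winding resistance:

```python
def supply_voltage_needed(current_a: float, coil_resistance_ohm: float) -> float:
    """Ohm's law: the voltage needed to drive a given current is V = I * R."""
    return current_a * coil_resistance_ohm

# A 0.05-ohm winding (hypothetical) needs only 5 V to carry 100 A:
print(supply_voltage_needed(100.0, 0.05))  # 5.0
```

This also answers the original question from the other direction: the load's resistance, not the supply's rating, determines how much current flows at a given voltage.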

DC power supplies can be unregulated, constant voltage, constant current, or combinations such as constant voltage with a current limit. Voltage, current, and power can all be limits within which the power supply can operate. For power limiting specifically, you'd be looking at an instrument called an electronic load. I've barely scratched the surface.

Your car battery has a short-circuit current of 400 A or so (cold-cranking amps), and each circuit is protected by a fuse, so it's all or nothing.

You really need a constant current power supply with a low compliance voltage. For critical situations, water flow and magnet temperature are used as interlocks.
 
So basically, I need to ramp up the current rather than the voltage, right? I forgot to mention that I have a rectifier that will convert the output to DC. I guess my confusion is with the load (in this case the coil of wire for the electromagnet): why can I measure a larger current in the load than my 1.5 A supply is actually producing? Or is the 1.5 A marking on the supply the maximum amperage it can supply before being destroyed?
 
Let me guess, you have a (1.5A) Variac and an ammeter in series with the magnet?

In an ideal transformer, power is conserved: Vp·Ip = Vs·Is, where Vp = 120 V, Ip(max) = 1.5 A, and Vs, Is are the secondary voltage and current.
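Plugging in numbers shows why a stepped-down output can read higher current than the primary rating. A short sketch of the ideal relation (the 30 V dial setting is an arbitrary example):

```python
def secondary_current(vp: float, ip: float, vs: float) -> float:
    """Ideal transformer power conservation: Vp*Ip = Vs*Is, so Is = Vp*Ip / Vs."""
    return vp * ip / vs

# Variac dialed down to 30 V, with 1.5 A available at the 120 V primary:
# ideally 6 A could flow on the secondary side...
print(secondary_current(120.0, 1.5, 30.0))  # 6.0
```

...but note the caveat in the next reply: a Variac's winding is not actually built to carry more than its rated current, so the ideal figure is not a safe operating point.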

With a proper (true-RMS) meter, this relationship holds, and the RMS value of the AC is the heating equivalent of the same DC voltage.
Because of waveform differences, meter differences, and losses, it's likely your numbers won't track exactly.
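"True RMS" just means the meter computes the square root of the mean of the squared samples, which is what makes an AC reading comparable to a DC one in heating terms. A quick numerical check on a sampled sine wave:

```python
import math

def rms(samples):
    """True RMS: square root of the mean of the squared samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# One full cycle of a 10 V-peak sine wave, sampled 1000 times:
sine = [10 * math.sin(2 * math.pi * k / 1000) for k in range(1000)]
print(rms(sine))  # ~7.071, i.e. 10 / sqrt(2)
```

A cheap averaging meter instead assumes a sine shape and scales a rectified average, which is exactly why the numbers won't track on distorted waveforms.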

With a Variac, you can't exceed the current rating of the Variac on the secondary side. The wire is not designed to handle it.

Also, Variacs do not isolate the AC line, so a shock hazard is present. You can take the output of the Variac and feed the primary of another transformer, say one with a 120 V primary and a 24 V, 4 A secondary, but remember the relationship Vp·Ip = Vs·Is.
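The Variac-plus-isolation-transformer arrangement can be sketched numerically. The transformer ratings come from the post above; the 60 V Variac setting is an arbitrary example:

```python
def isolated_output(variac_v: float,
                    xfmr_primary_v: float = 120.0,
                    xfmr_secondary_v: float = 24.0,
                    xfmr_secondary_a: float = 4.0):
    """Feed a Variac into a 120 V : 24 V isolation transformer.
    Output voltage scales with the turns ratio; the usable current
    is capped by the secondary's rating."""
    ratio = xfmr_secondary_v / xfmr_primary_v
    return variac_v * ratio, xfmr_secondary_a

v_out, i_max = isolated_output(60.0)
print(v_out, i_max)  # 12 V available, up to 4 A
```

So the Variac gives you smooth adjustment, while the second transformer provides isolation and a winding actually rated for the higher secondary current.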
 
To put it shortly: yes, the 1.5 A rating is the maximum amperage you should draw, not the maximum amperage it can output. This is true of most electronics. The ratings are there so you can design your circuits properly without blowing things up; they are not any sort of hard limit enforced by the device itself (usually).
 
As smanches noted, the current rating is generally the maximum allowed output, not the maximum it will deliver.

A Variac, for example, has a very low internal impedance and will draw many amps if the output is shorted, at least until the circuit fuse blows.

An exception to this is lab-type DC power supplies, which usually have a current-limit circuit that prevents them from being damaged if the output is shorted.
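An idealized sketch of that constant-voltage / constant-current behavior, using a hypothetical 12 V, 1.5 A bench supply (real supplies have more complex crossover characteristics):

```python
def cv_cc_output(v_set: float, i_limit: float, r_load_ohm: float):
    """Idealized lab supply: holds v_set until the load would draw more than
    i_limit, then folds into constant-current mode and lets the voltage sag."""
    i_demand = v_set / r_load_ohm
    if i_demand <= i_limit:
        return v_set, i_demand               # CV mode: voltage held, current follows load
    return i_limit * r_load_ohm, i_limit     # CC mode: current held, voltage sags

# 12 V, 1.5 A supply into a 10-ohm load: stays in CV mode at 1.2 A.
print(cv_cc_output(12.0, 1.5, 10.0))  # (12.0, 1.2)
# Into a 1-ohm near-short: limits at 1.5 A, output sags to 1.5 V.
print(cv_cc_output(12.0, 1.5, 1.0))   # (1.5, 1.5)
```

This is why a current-limited lab supply survives a short where a Variac just cooks its winding or blows the fuse.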
 
