Transformer Power/Current Ratings


dknguyen

https://coilcraft.com/pdfs/pwb.pdf

The 1:1 transformers in this datasheet have a 1/4 W power rating, a maximum voltage of 400V and a maximum current of 250mA (DC).

I wanted to use the transformer to pass a 100Vp @ 250mA bipolar 50% square wave through it. According to the current rating this is possible, and according to the voltage rating it is also possible. But according to the power rating it is not (100V x 250mA x 50% = 12.5W >> 250mW).
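As a quick sanity check of that arithmetic (just a sketch; it assumes, as the calculation above does, that power is only transferred during the 50% "on" portion of the wave):

Code:
# Sanity check of the numbers above. Assumes power flows only during the
# 50% "on" portion of the bipolar square wave, as the post assumes.
V_peak = 100.0   # V, peak of the bipolar square wave
I_load = 0.250   # A, load current
duty = 0.5       # fraction of the period the wave is "on"

P_avg = V_peak * I_load * duty   # average power through the transformer, W
P_rated = 0.250                  # W, datasheet RF input power rating

print(f"{P_avg:.1f} W average vs {P_rated:.2f} W rated -> {P_avg / P_rated:.0f}x over")
# prints: 12.5 W average vs 0.25 W rated -> 50x over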

I'm basically wondering how the power rating and the current rating tie together. From what I know right now, it seems that if you had a transformer with insulation capable of withstanding infinite voltage, you could just keep cranking up the voltage forever and it would work fine, provided you made sure the current never caused the transformer to saturate. But if that were the case, transformers wouldn't have power ratings, just maximum voltage and maximum current ratings.

Assuming the insulation can withstand it, can I just keep increasing the voltage on the transformer and have it operate normally, so long as the current remains below the rated current (to prevent saturation and heating)? From what I know of transformers right now, that seems to be the case. But if it were, transformers would only have maximum current and maximum voltage ratings, with no power rating, since they would work as long as you stayed below both limits. Real transformers do have power/VA ratings, though. Is there something about a transformer core that causes more heating when higher voltages are used and so gives the transformer a power rating?
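To make the saturation side of this concrete, here is a rough sketch of how the applied voltage (really the volt-seconds) sets the core flux regardless of load current. None of the numbers below come from the Coilcraft datasheet; the turns, core area and frequency are made-up illustration values:

Code:
# Rough illustration: voltage (volt-seconds), not load current, sets core flux.
# Turns, core area and frequency are hypothetical, chosen only to show the math.
N = 20           # turns on the driven winding (hypothetical)
A_e = 5e-6       # m^2, core cross-sectional area (hypothetical)
V = 100.0        # V, square-wave amplitude
f = 100e3        # Hz, switching frequency (hypothetical)
t_half = 0.5 / f # s, one half-cycle of the 50% bipolar square wave

# Faraday's law: V = N * dPhi/dt, so over one half-cycle the flux density swings by
delta_B = V * t_half / (N * A_e)   # tesla

print(f"Flux density swing: {delta_B:.1f} T")
# prints: 5.0 T -- far beyond the few hundred mT a ferrite can support, so this
# (made-up) core saturates no matter how small the load current is. Raising V or
# lowering f makes it worse; the load current doesn't enter the equation at all.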

Thanks.

EDIT: I found this
"What causes eddy currents: Virtually all of the
flux induced by the primary winding is contained
within the core. The core itself is a single turn secondary
linked to all of the windings. A voltage is induced
around the core periphery equal to the
volts/turn applied to the windings. The core material
has a finite resistivity, which translates into a resistance
value around the core periphery. The voltage
induced around the core forces a current -the eddy
current -to flow through this resistance. The result is
rR loss. The eddy current is reflected into the primary
according to the ratio of the primary turns to the
single turn "core secondary". In the primary, it is
considered part of the magnetizing current, although
it is pure loss, and in fact absorbs some of the stored
energy in the core."

which seems to say that eddy currents (and therefore some additional heating) increase with voltage, which would technically answer my question. It doesn't seem enough on its own to explain power ratings, though, since eddy losses aren't always significant.
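To see roughly how the quoted mechanism scales, here is a sketch that treats the core as the single-turn secondary the quote describes. The turns count and the effective core resistance below are invented numbers, purely for illustration:

Code:
# Sketch of the mechanism in the quote: the core acts as a single-turn
# secondary, so eddy loss scales with (volts per turn)^2. Turns count and
# effective core resistance are hypothetical.
N = 20             # primary turns (hypothetical)
V_primary = 100.0  # V applied to the primary
R_core = 50.0      # ohm, effective resistance around the core periphery (hypothetical)

volts_per_turn = V_primary / N
I_eddy = volts_per_turn / R_core        # current circulating around the core, A
P_eddy = volts_per_turn ** 2 / R_core   # I^2*R loss dissipated in the core, W
I_reflected = I_eddy / N                # eddy current reflected into the primary, A

print(f"Eddy loss: {P_eddy * 1e3:.0f} mW, reflected into primary: {I_reflected * 1e3:.1f} mA")
# Doubling the applied voltage doubles the volts/turn and quadruples the eddy
# loss, which is one way a core limit shows up as a power rating rather than
# just a current rating.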
 
Read the datasheet.

400V INTERWINDING INSULATION - there can be 400V between the primary and the secondary, NOT 400V across a winding.

250mA current rating - how much current the wire (the windings) can carry.
May also reflect the maximum current before core saturation.

250mW RF INPUT POWER - the most RF power you can put into the transformer.
Output power = Input - losses
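Lining the drive from the first post up against those three limits (a rough sketch; the 12.5 W figure is the one calculated above):

Code:
# Checking the drive described in the first post against the three datasheet
# limits listed above. The 12.5 W figure is taken from the first post.
V_interwinding_max = 400.0   # V, primary-to-secondary insulation rating
I_winding_max = 0.250        # A, winding (wire) current rating
P_input_max = 0.250          # W, RF input power rating

I_drive = 0.250              # A, intended load current
P_drive = 12.5               # W, average power from the first post

print("Current within rating:", I_drive <= I_winding_max)   # True (right at the limit)
print("Power within rating:  ", P_drive <= P_input_max)     # False, about 50x over
# Note the 400V figure is isolation between windings; it says nothing about how
# much voltage a single winding can stand, hence the need to ask Coilcraft.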


does this help?

JimB
 
Ah crap. I guess I have to email the company to find out how much voltage you can stick across the winding, then. I've got a feeling I won't like their answer.
 