I have a few transformers that my work gave me out of some broken UPS modules (power rating unknown). They are pretty beefy, heavy, and have 10GA output wires. I want to use them in a couple 12-14v power supply projects, but don't want to overwork them, yet I'd like to get the most from them.
They have a bunch of numbers on them, but no company info. A Google search on the numbers turned up nothing. So, I determined which wires were which and did some measuring (all voltage measurements are AC):
Transformer 1: A single set of large gauge outputs (no center tap)
Unloaded = 16.21v
Loaded with 10.8Ω = 16.13v (1.49A)
Loaded with 2.5Ω = 15.9v (6.36A)
Loaded with 0.81Ω = 15.4v (19.01A)
I estimated the coil's effective output resistance by subtracting the loaded voltage from the no-load voltage and dividing by the current. Naturally it varies a bit with load, but 0.045Ω is the average.
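In case anyone wants to check my arithmetic, here's roughly the calculation I did, as a quick Python sketch (the voltages and currents are just my measurements from above):

```python
# Estimate effective output resistance from no-load vs. loaded voltage:
# R_out ≈ (V_no_load - V_loaded) / I_load

def output_resistance(v_no_load, measurements):
    """measurements: list of (loaded_voltage, current) pairs."""
    r_values = [(v_no_load - v) / i for v, i in measurements]
    return sum(r_values) / len(r_values)

# Transformer 1 readings from above
r1 = output_resistance(16.21, [(16.13, 1.49), (15.9, 6.36), (15.4, 19.01)])
print(f"Transformer 1: {r1:.3f} ohms")
```

Averaging the three load points this way lands in the 0.045-0.048Ω ballpark depending on rounding.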
Transformer 2: This one has a center tapped output, but I loaded and measured across the whole output.
Unloaded = 30.78v
Loaded with 10.8Ω = 30.34v (2.8A)
Loaded with 2.5Ω = 29.05v (11.62A)
Using the same method above, I calculated coil resistance to be 0.15Ω average.
Transformer 3: This transformer (Stancor) DOES have the rating listed on it: 36vAC @ 6A. To verify, I measured the output:
Unloaded = 39.43v
Loaded with 10.8Ω = 37.84v (3.50A)
Loaded with 2.5Ω = 33.82v (13.53A) - over the rating, but wanted a consistent test.
I calculated the average coil output resistance to be 0.454Ω. At the 6A rated output, I calculated the voltage would drop to 36.85v, which is ~93% of the unloaded value. Is this 93% factor consistent for transformers in general, or is there another way to go about this?
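Here's the same method run forward to the Stancor's 6A rating, again as a rough Python sketch of what I did by hand:

```python
# Predict the Stancor's loaded voltage at its rated 6 A using the
# measured effective output resistance: V ≈ V_no_load - I * R_out
v_nl = 39.43
r_vals = [(v_nl - 37.84) / 3.50, (v_nl - 33.82) / 13.53]
r_out = sum(r_vals) / len(r_vals)   # effective output resistance
v_rated = v_nl - 6 * r_out          # predicted voltage at the 6 A rating
regulation = v_rated / v_nl         # the "93% factor" I mentioned
print(f"R_out = {r_out:.3f} ohms, V at 6A = {v_rated:.2f} V "
      f"({regulation:.1%} of no-load)")
```

Depending on which load points you average, the resistance comes out around 0.43-0.45Ω, which is where my 36.85v / ~93% numbers came from.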
Now, is there some rule of thumb for determining approximately how much they can be loaded? For example, could I use the info I have to calculate the output current at which the output voltage drops a certain % below the unloaded value?
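To make the question concrete, this is the inversion I have in mind. It's purely the resistive model, and I realize the real limit is probably heating rather than voltage sag, so the 5% figure here is just a placeholder, not a real rating:

```python
# Invert the model: at what current does the output sag a given
# fraction below the no-load voltage?  I = V_no_load * droop / R_out
def current_at_droop(v_no_load, r_out, droop_fraction):
    return v_no_load * droop_fraction / r_out

# Example: transformer 1 with my averaged 0.045 ohm figure and a 5% sag
i_5pct = current_at_droop(16.21, 0.045, 0.05)
print(f"{i_5pct:.1f} A")
```

That comes out around 18A for transformer 1, which is why I'm wondering whether some standard % drop is what manufacturers actually rate against.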
Sorry for the long post. And thanks for any assistance you can provide.