I have a small transformer with multiple output taps ranging from 1.5V to 12V at 350mA. The transformer has several windings and you can choose the output (I don't know what these transformers are called).
I've noticed that when I measure the output with my multimeter, it reads higher than it should. In school I learned that I should put a "load resistor" in parallel with the output so the transformer always has a small load to dissipate into.
So I tried to figure out what I need by myself, and I ended up with this: 12 x 0.35 = 4.2W, using 12V because it's the maximum output voltage and 0.35A because it's the maximum output current.
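Here is that arithmetic as a quick Python sketch, just to show my reasoning. It assumes the resistor would sit across the 12V tap and draw the full rated 350mA (the worst case I was calculating); the resistance from Ohm's law is my guess at how you would size it:

```python
# Worst-case numbers, assuming the load resistor sits across the
# 12 V tap and draws the transformer's full rated current of 350 mA.
v_out = 12.0   # volts: the highest output tap
i_max = 0.350  # amps: the rated output current

p = v_out * i_max  # power the resistor dissipates: P = V * I = 4.2 W
r = v_out / i_max  # Ohm's law: R = V / I, about 34.3 ohms

print(f"R = {r:.1f} ohms, dissipating P = {p:.2f} W")
```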
Do I really need a 4.2W resistor?
And how do I calculate how many ohms the resistor needs to be?
Also, can someone explain why the output is higher than it should be?
Thank you guys!