12V Transformer outputting 14V?

Status
Not open for further replies.

freeskier89

New Member
I have a standard RadioShack wall transformer with supposedly 12VDC output, and when I check it with a voltmeter it reads 14V. Am I missing something? Or should I take it back to RadioShack with a voltmeter to show them the problem, and hopefully get a new one? Or would it be easier to just put, say, a 3W resistor on it (I'll be drawing probably about 500mA) to make it 12V?

Thanks :D !
-freeskier89
 
That situation is quite normal. Those "wall wart" adapters have a transformer, rectifier and filter capacitor in them when they output DC, and they have that characteristic of higher-than-marked output voltage with no load. Load them down to their rated current and the output voltage will come right down, thanks to the internal resistance of all the circuitry inside the various power supply components.

Even a transformer by itself without all the AC-to-DC conversion circuitry behind it will exhibit that same high-output-with-no-load characteristic.

But rarely do you have to worry about the "raw" voltage being too high; too low is usually more of a problem. The raw AC voltage is rectified, filtered and usually run through an electronic regulation circuit to achieve the desired output voltage regardless of normal variances in the mains voltage and/or the secondary voltage of the transformer.


Dean
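
A minimal sketch (Python) of the loading effect Dean describes above. The internal resistance is simply the value implied by the 14V no-load reading and the 12V-at-1A nameplate rating from this thread, so treat it as an illustration rather than a measurement:

# Why an unregulated "wall wart" reads high with no load: model it as an ideal
# source behind an effective internal resistance (derived from assumed figures).

V_NO_LOAD = 14.0   # volts, measured with no load (as in the original post)
V_RATED = 12.0     # volts, nameplate rating at rated current
I_RATED = 1.0      # amps, nameplate rated current

# Effective internal resistance implied by the figures above (2 ohms here)
r_internal = (V_NO_LOAD - V_RATED) / I_RATED

for i_load in (0.0, 0.25, 0.5, 0.75, 1.0):
    v_out = V_NO_LOAD - i_load * r_internal
    print(f"load {i_load:.2f} A -> output about {v_out:.1f} V")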
 
Wow thanks for the virtually instant replies!

The power supply is rated at 1A. So if I am drawing, say, 500mA, I should be fine and it will read closer to 12V? Or would it be best to just add some other loads to get closer to the 1A rating?

Thanks again!
 
If you really need 12V at all times, you should add a voltage regulator.

Put a LARGE smoothing capacitor on the rectified output of your transformer, 4700µF or so.
This should give you an unregulated voltage of approx. 16V

Then use a 7812 IC to regulate your output to be 12V at all times
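
A rough sanity check (Python) of the suggestion above. It assumes full-wave rectification of 60 Hz US mains (so 120 Hz ripple), the roughly 16V unregulated level quoted above, the 500mA load mentioned earlier in the thread, and the typical 2V dropout of a 7812; none of these numbers are measurements:

C = 4700e-6        # farads, suggested smoothing capacitor
I_LOAD = 0.5       # amps, load current mentioned earlier in the thread
F_RIPPLE = 120.0   # hertz, full-wave rectified 60 Hz mains (assumption)
V_UNREG = 16.0     # volts, approximate unregulated level quoted above
V_DROPOUT = 2.0    # volts, typical 7812 dropout (check the datasheet)

# Classic capacitor-input filter approximation: ripple ~= I / (f * C)
v_ripple = I_LOAD / (F_RIPPLE * C)
v_min_input = V_UNREG - v_ripple

print(f"peak-to-peak ripple ~ {v_ripple:.2f} V")
print(f"lowest input to the 7812 ~ {v_min_input:.2f} V "
      f"(needs to stay above about {12.0 + V_DROPOUT:.1f} V)")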
 
I know that I am bringing my dead post up again, but oh well. I am finally getting around to building this thing I am working on, and I am making the PCBs. I want to make sure my circuit is essentially flawless. My question is: did I get your regulation description right, exo? I don't seem to understand how adding a cap in parallel will cause the voltage to rise to 16V. Is it just charging to the 16V peaks, and then discharging that 16V when the wall wart is at a lower potential? Clarification would be great :D

Oh, one more quick question. Do you think that this heatsink will work for the 7812 with an amp going through it? https://jameco.com/webapp/wcs/store...&catalogId=10001&productId=326641&pa=326641PS

Thanks a lot!
-freeskier89 :D

// I can get rid of the 0.33µF cap, right, since it is just adding 0.33 to the 4700µF?
 

Attachments

  • reg.png (regulator schematic) · 1.4 KB
freeskier89 said:
// I can get rid of the 0.33µF cap, right, since it is just adding 0.33 to the 4700µF?
No, the 4700 µF has a high impedance at high frequencies, so it needs a 0.33 µF in parallel, since the 0.33 µF has a low impedance at high frequencies. The low impedance at high frequencies is important for the stability of the regulator. It will also attenuate high-frequency noise (such as radio stations) from the power mains that might otherwise affect whatever circuit you are powering.
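
To put some illustrative numbers on that, here is a small Python sketch that compares the two capacitors as series R-L-C models. The ESR and ESL figures are rough guesses for a typical aluminium electrolytic versus a small film/ceramic part, not datasheet values:

import math

def impedance(c_farads, esr_ohms, esl_henries, f_hertz):
    """Magnitude of a series R-L-C model of a real capacitor."""
    x_c = 1.0 / (2 * math.pi * f_hertz * c_farads)
    x_l = 2 * math.pi * f_hertz * esl_henries
    return math.hypot(esr_ohms, x_l - x_c)

caps = {
    "4700 uF electrolytic": (4700e-6, 0.10, 20e-9),   # assumed ESR / ESL
    "0.33 uF film/ceramic": (0.33e-6, 0.02, 3e-9),    # assumed ESR / ESL
}

for f in (120.0, 10e6):   # mains ripple frequency vs. high-frequency noise
    print(f"--- {f:,.0f} Hz ---")
    for name, (c, esr, esl) in caps.items():
        print(f"{name}: |Z| ~ {impedance(c, esr, esl, f):.3f} ohm")

With these assumed parasitics, the big electrolytic has the lower impedance at the 120 Hz ripple frequency and the small capacitor has the lower impedance at 10 MHz, which is the point being made above.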
 
freeskier89 said:
I don't seem to understand how adding a cap in parallel will cause the voltage to rise to 16V. Is it just charging to the 16V peaks, and then discharging that 16V when the wall wart is at a lower potential? Clarification would be great :D

If you look at the output of the "wall wart" on an oscilloscope, it would show an AC ripple (approximately a sawtooth shape) superimposed on the DC level. If you measure the voltage with a DC voltmeter, it will measure the average voltage.

If, for example, the upper peak of the ripple is 17 V, and the lower "peak" is at 13 Volt, then the meter would read about 15 V. If you then connect a large cap in parallel, it will reduce the ripple amplitude. Say it reduces it so that the lower peak is at 15 Volt (the upper peak of the ripple will still be at about 17 V), then the meter would read about 16 V.

Note that the ripple amplitude will increase if more current is drawn from the "wall wart". Also note that the regulator needs a minimum voltage difference between its input and output. Otherwise, it will not work properly, it will "drop out". This is specified in the regulator's data sheet. From memory (I'm not at home) this reg needs a minimum differential of 3 Volt. So you are "sailing close to the wind".

You therefore need to know the ripple amplitude that will occur at the maximum current. This is important since the regulator may not have enough differential near the lower "peaks".
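
The arithmetic above, written out as a short Python sketch. The peak values are the ones used in the explanation and the 3 Volt differential is the figure quoted from memory; none of them are measurements:

V_UPPER_PEAK = 17.0   # volts, top of the ripple in the example above
V_OUT = 12.0          # volts, regulator output
V_DIFFERENTIAL = 3.0  # volts, minimum input-output differential quoted above

for label, v_lower_peak in (("before the big cap", 13.0),
                            ("after the big cap", 15.0)):
    v_meter = (V_UPPER_PEAK + v_lower_peak) / 2       # roughly what a DC meter shows
    margin = v_lower_peak - (V_OUT + V_DIFFERENTIAL)  # worst case is the lower peak
    print(f"{label}: meter reads ~{v_meter:.0f} V, "
          f"margin at the bottom of the ripple {margin:+.0f} V")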
 
Great! I understand it much better now :D (I assume my schematic is alright)
So do I need to load the wall wart down and look at its performance on an oscilloscope, or do you think I can just hope for the best :S. I would test it, but it will be hard for me to get hold of an oscilloscope.
Thanks!
 
freeskier89 said:
So do I need to load the wall wart down and look at its performance on an oscilloscope, or do you think I can just hope for the best :S. I would test it, but it will be hard for me to get hold of an oscilloscope.
Thanks!
Your "schematic" looks alright.
Although a scope would be ideal, you can do it without one.

Measure the DC voltage with a dummy load (to draw the maximum current) and then switch the voltmeter to AC and you will measure the ripple voltage. You may (depending upon the type of meter) need a capacitor (1 uF or greater) in series to block the DC.

If you double this figure, it will give you an approximate measurement of the peak-to-peak ripple voltage. It won't be exact, since the meter (unless you have a true-RMS meter) is designed to measure sine-wave voltages.

I looked at the spec for the 7812 and it has a typical dropout voltage of 2 Volt. So I suggest you ensure that there is at least a 3 V differential.
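
A Python sketch of how those meter readings might be interpreted. The DC and AC readings below are placeholders, not measurements, and the doubling is the rough sine-calibrated approximation suggested above:

v_dc_reading = 14.0   # volts, DC reading under the dummy load (placeholder)
v_ac_reading = 0.8    # volts, AC reading of the ripple (placeholder)

# Doubling the AC reading gives an approximate peak-to-peak ripple, per the
# suggestion above (approximate, because the meter assumes a sine wave).
v_ripple_pp = 2 * v_ac_reading
v_lower_peak = v_dc_reading - v_ripple_pp / 2

v_needed = 12.0 + 3.0   # 12 V output plus the suggested 3 V differential
print(f"estimated lower peak: {v_lower_peak:.1f} V "
      f"(want at least about {v_needed:.1f} V)")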
 
Alright, I will try that later today when I have access to my multimeter and such. If it turns out that the ripple is less than optimal, what should I do? Would it be best just to go over to RadioShack and get a 15V transformer, if they have one?
 
freeskier89 said:
Alright, I will try that later today when I have access to my multimeter and such. If it turns out that the ripple is less than optimal, what should I do? Would it be best just to go over to RadioShack and get a 15V transformer, if they have one?
When you say a "15V transformer" do you mean a "wall wart" with a 15 V DC output or a transformer? If you mean the latter, then you will need a rectifier bridge.

Other possible solutions would be to:-

1. buy a 15 Volt wall wart, which will give you about 15 × √2 Volt DC, i.e. 15 × 1.414 ≈ 21 Volt.

2. use a "low-dropout" regulator in lieu of the 7812, such as the LM2940CT-12, which supplies 12V with a max output of 1 Amp. I don't have a data sheet for it, so I suggest you download one and look at the "dropout voltage" spec.

I recommend you investigate the second option first. If there is sufficient differential, then you won't need to buy another "wall wart".
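
For comparison, a short Python sketch of the headroom each option needs. The dropout figures are the ones quoted in this thread (about 2V typical for the 7812, about 1V for the LM2940CT-12); check the datasheets before relying on them:

V_OUT = 12.0

regulators = {
    "7812 (standard)": 2.0,            # volts dropout, typical figure quoted above
    "LM2940CT-12 (low dropout)": 1.0,  # volts dropout, figure quoted in this thread
}

for name, v_dropout in regulators.items():
    print(f"{name}: input must stay above about {V_OUT + v_dropout:.1f} V "
          "at the bottom of the ripple")

# The other option: a 15 Volt wall wart, roughly 15 * sqrt(2) ~ 21 V unloaded DC.
# Plenty of headroom, but the regulator then has to dissipate more heat.
print(f"15 V wall wart: ~{15 * 2 ** 0.5:.0f} V unregulated")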
 
I meant a 15V wall wart. The LM2940CT-12 has a 1V dropout. I would prefer just to get another wall wart, because I would have to order the LM2940 and that would set me back another week. I hope that the transformer has a high enough differential that I do not have to buy another thing :D. I should be able to test it in a little bit.

Thanks!
 
When a regulator is at its "dropout" input voltage, it is no longer regulating properly, since its output voltage has already dropped 100mV from its regulated value. That isn't much of a drop for a 12V regulator, but it is a bigger percentage of the output for a 3V regulator.
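
In numbers (Python), using the 100mV figure above:

V_SAG = 0.1   # volts, output sag at the dropout point, per the post above

for v_out in (12.0, 3.0):
    print(f"{v_out:.0f} V regulator: a {V_SAG:.1f} V sag is "
          f"{V_SAG / v_out * 100:.1f} % of the output")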
 