Can anyone help me with converting an AC mains voltage to a dual-polarity DC output with minimal ripple or noise? What should the circuitry for this consist of?
First of all, you need to know the intended rail voltage and current requirements. For instance, a 100-watt-per-channel stereo amplifier requires approximately +/-50 V at 12 A. So you would need a transformer with a center-tapped secondary of about 71 VAC end to end (roughly 35-0-35) rated at 12 A. You would then add a bridge rectifier rated at about 200 V and 25 to 40 A (when the power supply is first turned on, the bridge rectifier has to charge the empty filter capacitors, and that charging current can be a lot more than what the amplifier itself uses), and a pair of electrolytic capacitors, one per rail, with a minimum value of 4700 uF at 85 V. Increasing the capacitance decreases your ripple, but it also increases the initial charging current when the amplifier is switched on.
This is just one example; it all depends on what the power supply is for. For a small audio device using op amps, you might only need +/-12 V at 1 A, while a big amplifier might need something along the lines of the example above. A rough calculation sketch follows below.
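Since most of this comes down to two numbers (the secondary voltage you need for a given rail, and how much capacitance you need for a given ripple), here is a quick back-of-the-envelope sketch in Python. It assumes 60 Hz mains, about a 1 V rectifier drop, and a 2 A average draw for the ripple example; those figures are just for illustration, and the estimate ignores transformer regulation, capacitor ESR, and conduction angle, so treat it as a rough starting point rather than a design tool.

```python
import math

def secondary_rms_per_half(v_rail_dc, v_diode=1.0):
    """Approximate RMS voltage needed on each half of a center-tapped
    secondary to reach a given DC rail after full-wave rectification.
    Ignores transformer regulation and ripple sag."""
    return (v_rail_dc + v_diode) / math.sqrt(2)

def ripple_pp(i_load, c_farads, mains_hz=60):
    """Rough peak-to-peak ripple of a full-wave rectified capacitor-input
    supply: dV ~= I / (2 * f * C)."""
    return i_load / (2 * mains_hz * c_farads)

def cap_for_ripple(i_load, ripple_target, mains_hz=60):
    """Capacitance needed per rail for a target peak-to-peak ripple."""
    return i_load / (2 * mains_hz * ripple_target)

if __name__ == "__main__":
    # The +/-50 V, 100 W/channel example from the post above.
    v_half = secondary_rms_per_half(50)
    print(f"Secondary: about {v_half:.0f}-0-{v_half:.0f} VAC "
          f"({2 * v_half:.0f} VAC end to end)")

    # Ripple with 4700 uF per rail at an assumed 2 A average draw
    # (illustrative only -- a music load averages far below the 12 A peak).
    print(f"Ripple at 2 A, 4700 uF: {ripple_pp(2, 4700e-6):.1f} Vp-p")

    # Capacitance per rail to hold ripple near 1 Vp-p at that same draw.
    print(f"C for 1 Vp-p at 2 A: {cap_for_ripple(2, 1) * 1e6:.0f} uF")
```

Running it gives roughly a 36-0-36 (about 72 VAC end-to-end) secondary, in line with the 71 VAC figure above, and shows how quickly the capacitance requirement grows if you want low ripple at higher average currents.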