Swapping Batteries for a Computer Power Supply to power 16 circuits?


yelm

New Member
I have designed and built an infrared-triggered instrument: when an infrared beam is broken, a single-stringed, guitar-like instrument is played by electromagnetism from a DIY sustainer device built with guitar pickups. The instrument has its own amplifier and speaker. An Arduino (microcontroller) reads the analog infrared receiver; when no infrared is detected, the Arduino switches power on through an optoisolator by outputting +5 V.
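
In case it helps to see the logic, this is roughly what the Arduino does. It's a minimal sketch; the pin numbers and the analog threshold are placeholders, not the exact values from my build:

// Pin numbers and the analog threshold are placeholders,
// not the exact values from the actual build.
const int IR_RECEIVER_PIN = A0;  // analog input from the IR receiver
const int OPTO_PIN = 2;          // digital output driving the optoisolator
const int IR_THRESHOLD = 200;    // readings below this = beam broken (assumed)

void setup() {
  pinMode(OPTO_PIN, OUTPUT);
}

void loop() {
  int ir = analogRead(IR_RECEIVER_PIN);  // 0-1023 from the receiver
  if (ir < IR_THRESHOLD) {
    digitalWrite(OPTO_PIN, HIGH);  // no IR detected: switch the sustainer on
  } else {
    digitalWrite(OPTO_PIN, LOW);   // beam intact: sustainer off
  }
}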

The amplifier and sustainer run from one 9 V battery, the Arduino from a computer USB port (it needs 5 V), the infrared transmitter from 5 V supplied by the Arduino, and the infrared Tx circuit from another 9 V battery!

That's a lot of batteries, and I would like to replace them with a single power supply, especially as I would like to install 16 of these instruments on a staircase.

I would like to use a computer power supply, as these output 12 V and 5 V, which could work in place of the 9 V batteries (the circuits that need 9 V will run happily from 12 V). However, the computer power supply delivers these voltages at the following currents:

12 V (DC) at 13 A,
5 V at 15 A,
5 V at 2.2 A.

Will this blow up my components? How can I ensure just the right amount of power gets through to each of the 16 sets of circuits?

According to the internet, a 9 V battery outputs between 50 mA (0.05 A) and 140 mA (0.14 A). Let's say, for argument's sake, 1 A.

If we look at this equation:

Power (watts) = Voltage (volts) x Current (amps)

9 volts x 1 amp = 9 watts (produced by the 9 V battery)

9 volts x 13 amps = 117 watts! (the computer power supply is rated at 12 V at 13 A, 5 V at 15 A)

That's a massive increase in watts. Will this just burn all of the components I've put together?

How could I test it without damaging anything?

Can anyone help with this?
 
The max output current rating of a power supply is a statement of how much current it can deliver while still regulating its output voltage, not overheating, etc.

The current drawn by any load connected to the supply is determined by the load, not by the supply. As a circuit designer, you simply need to determine whether your load is within the capacity of the power supply. If your load requires only 1 A at 12 V and you have a 12 V, 13 A supply, your load will only use about 7% of the supply's capacity. If your load requires 20 A but you have only 13 A available, then you need to buy another supply.

Suppose you have a load with a resistance of 10 ohms and you connect it to a 12 V supply. How much current will it draw? I = E/R = 12/10 = 1.2 A (Ohm's law). Does it matter that the power supply's capacity is 13 A?

Now connect 13 such circuits to your 13 A supply. Each load draws 1.2 A, so 13 of them would draw 13 x 1.2 A = 15.6 A, and your 13 A supply is overloaded by 2.6 A!

You need to measure how much current each of your circuits draws at each of its voltage inputs. Use a digital multimeter temporarily connected in series in the positive lead of one of your circuits while it is operating. Then add up the total current you need at each of the voltages...
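
To make the adding-up concrete, here it is as a small C++ program. The per-circuit currents in it are invented placeholders, not measurements; substitute whatever your multimeter actually reads:

#include <cstdio>

int main() {
    // Placeholder measurements -- replace with values read from a multimeter.
    const double amps12vPerCircuit = 0.35;  // amplifier + sustainer on the 12 V rail (assumed)
    const double amps5vPerCircuit  = 0.10;  // Arduino + IR parts on the 5 V rail (assumed)
    const int    circuits          = 16;

    // Supply ratings quoted in the question.
    const double supply12v = 13.0;
    const double supply5v  = 15.0;

    const double need12v = amps12vPerCircuit * circuits;  // 0.35 * 16 = 5.6 A
    const double need5v  = amps5vPerCircuit  * circuits;  // 0.10 * 16 = 1.6 A

    std::printf("12 V rail: need %.1f A of %.1f A available\n", need12v, supply12v);
    std::printf(" 5 V rail: need %.1f A of %.1f A available\n", need5v,  supply5v);

    if (need12v <= supply12v && need5v <= supply5v)
        std::printf("Within capacity: each load draws only what it needs.\n");
    else
        std::printf("Overloaded: get a bigger supply or split the circuits.\n");
    return 0;
}

The point of the comparison is that the supply's ampere rating is a ceiling on what it can deliver, not an amount that gets pushed into the circuits.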
 
Thanks Mike!

Cheers for taking a look at that. That's reassured me that I'm not going to blow everything up!

I will use my multimeter to check how much current my circuit draws at each of the voltages.
 
