I have designed and built an infrared-triggered instrument: when an
infrared beam is broken, a single-stringed, guitar-like instrument is
played by electromagnetism from a DIY sustainer device built with
guitar pickups. The instrument has its own amplifier and speaker.
An Arduino (microcontroller) reads the analog infrared receiver; when no infrared is detected, the Arduino switches power on through an optoisolator by outputting +5 V.
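In case it helps, here's roughly what the trigger logic looks like (a minimal sketch; the pin numbers and threshold below are placeholders, not the values in my actual build):

```cpp
// Minimal sketch of the trigger logic: beam broken -> power on via optoisolator.
// Pin numbers and threshold are placeholders, not values from the real build.
const int IR_SENSOR_PIN = A0;   // analog IR receiver output
const int OPTO_PIN      = 2;    // drives the optoisolator's input LED
const int IR_THRESHOLD  = 300;  // tune per receiver: below this = no IR seen

void setup() {
  pinMode(OPTO_PIN, OUTPUT);
}

void loop() {
  int irLevel = analogRead(IR_SENSOR_PIN);
  if (irLevel < IR_THRESHOLD) {
    // Beam broken (no infrared detected): switch the sustainer/amp power on
    digitalWrite(OPTO_PIN, HIGH);   // +5 V into the optoisolator input
  } else {
    digitalWrite(OPTO_PIN, LOW);
  }
}
```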
The amplifier and sustainer run from one 9 V battery, the Arduino from
a computer USB port (it needs 5 V), the infrared transmitter from the Arduino's 5 V output,
and the infrared Tx circuit from another 9 V battery!
That's a lot of batteries, and I would like to replace them with a single
power supply, especially as I would like to install 16 of these
instruments on a staircase.
I would like to use a computer power supply, as these output 12 V and
5 V, so they could work in place of the 9 V batteries. The circuits
that need 9 V will run happily from 12 V. However, the computer power supply delivers these voltages at the following currents:

12 V DC at 13 A
5 V at 15 A
5 V at 2.2 A

Will this blow up my components? How can I ensure just the right
amount of power gets through to each of the 16 sets of circuits?
According to the internet, a 9 V battery outputs between 50 mA (0.05 A)
and 140 mA (0.14 A). Let's say, for argument's sake, 1 A.
If we look at this equation:

POWER (watts) = voltage (volts) × current (amps)

9 volts × 1 amp = 9 watts (produced by the 9 V battery)
12 volts × 13 amps = 156 watts! (the computer power supply is
rated at 12 V at 13 A and 5 V at 15 A)
That's a massive increase in watts. Will this just burn all of the
components I've put together?
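For scale, here's the back-of-the-envelope sum I've been doing for all 16 instruments on the 12 V rail (the per-instrument current draw is a rough guess, not a measurement):

```cpp
#include <cstdio>

// Rough power-budget check for 16 instruments on the PSU's 12 V rail.
// The per-instrument current is a guess, not a measured value.
int main() {
    const int   instruments = 16;
    const float railVoltage = 12.0f;  // the PSU's 12 V rail
    const float railMaxAmps = 13.0f;  // rated maximum the rail can deliver
    const float ampsPerUnit = 0.5f;   // guessed draw: amp + sustainer + IR Tx

    float totalAmps  = instruments * ampsPerUnit;
    float totalWatts = railVoltage * totalAmps;   // P = V x I

    std::printf("Total draw: %.1f A (%.0f W) of %.1f A available\n",
                totalAmps, totalWatts, railMaxAmps);
    std::printf(totalAmps <= railMaxAmps ? "Within the rail rating\n"
                                         : "Over the rail rating!\n");
    return 0;
}
```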
How could I test it without damaging anything?
Can anyone help with this?