DC to DC, will this work??

Status
Not open for further replies.

Kingdom Man

New Member
Will a DC to DC converter work if I want to increase my DC voltage? I have 13.5 VDC and I would like to increase my voltage. I haven't worked out all the details as to exactly how many volts I need, but it looks like I will need somewhere between 20-30 VDC @ about 1 amp. Any help would be greatly appreciated!
 
That would work. With 30 V output and 13.5 V input (your worst-case scenario), you would theoretically/ideally need an input current of 30 V / 13.5 V = 2.22 times the output current.

At 1 A output that is 2.22 A. In reality it's going to be a bit more due to losses and other things like that. Since the IC uses an inefficient NPN switch rated for 3 A, you might need a small heatsink.
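The input-current estimate above can be sketched with a few lines of arithmetic. The 85% efficiency figure here is an assumed ballpark for a simple NPN-switch boost IC, not a number from the thread or a datasheet:

```python
# Rough boost-converter input-current estimate.
def boost_input_current(v_in, v_out, i_out, efficiency=0.85):
    """Input current a boost converter must draw for a given output."""
    p_out = v_out * i_out      # output power in watts
    p_in = p_out / efficiency  # input power must also cover losses
    return p_in / v_in         # I = P / V

i_ideal = boost_input_current(13.5, 30.0, 1.0, efficiency=1.0)
i_real = boost_input_current(13.5, 30.0, 1.0)

print(f"ideal input current:   {i_ideal:.2f} A")  # ~2.22 A
print(f"with ~85% efficiency:  {i_real:.2f} A")   # ~2.61 A
```

Note that the "real" figure comfortably clears the ideal 2.22 A, which is why the 3 A switch rating and a small heatsink matter.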
 


Thanks very much for the reply.
The switch you talk about (that is also in the datasheet for this part I have listed) - what would I use for this switch? I am new to all of this, so I need to learn. Thanks for any help.
Another question: if you increase voltage, do you increase watts but decrease amps? Is this right? In other words, my question is: if I build this step-up converter and I ran a light bulb off this circuit, would it light the bulb brighter because this circuit adds more watts?
 
For your particular device the switch is integrated into the IC, so you don't get to choose.

Power in must always equal power out. Watts STAY THE SAME (in reality output power actually decreases due to losses). The circuit does NOT add watts. It does NOT output more energy than you put into it - that would be a perpetual motion machine. This means you can change higher voltage at lower current into lower voltage at higher current, or lower voltage at higher current into higher voltage at lower current.

Now...the lightbulb is a fixed load (a resistance). Which means if you increase the voltage going into it, it will also draw more current and get brighter because it is consuming more watts. Where do these watts come from? They come from the increased watts you are putting into the regulator. The regulator is NOT making more watts. You are putting more watts into the regulator, and it is changing how the energy is split between voltage and current into a form the light bulb can use. (I.e. if the voltage is too low, the resistance isn't going to draw more current, which means no more watts. So to get more power you have to increase the voltage, which makes it draw more current, which means more watts.)

It's like a lever where you have force and distance/speed, and you can trade off force for more distance/speed or distance/speed for more force. Except instead of force and distance/speed, you have current and voltage.
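The fixed-load behavior described above follows directly from Ohm's law: power into a resistance scales as V²/R. A quick sketch, with an illustrative 24-ohm bulb resistance that is an assumption (real filament resistance also rises with temperature, which this ignores):

```python
# A lightbulb modeled as a fixed resistance: raising the voltage makes
# it draw more current, so power (and brightness) rises as V^2 / R.
def bulb_power(voltage, resistance=24.0):
    current = voltage / resistance  # Ohm's law: I = V / R
    return voltage * current        # P = V * I = V^2 / R

for v in (13.5, 20.0, 30.0):
    print(f"{v:4.1f} V -> {bulb_power(v):5.1f} W")
```

Going from 13.5 V to 30 V roughly quintuples the power into the bulb - all of which has to come in through the converter's input.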
 

Thank you again, very useful info.
So does this mean that if I increase the voltage through the use of this circuit, it will need to draw more current from the voltage-in source, otherwise it would not light the bulb brighter? Is this right thinking? How would I be able to test whether my 13 VDC power source is enough to light the bulb brighter through the use of this circuit? BTW, this is an automotive application, so the supply ultimately comes from the car's battery.
 
I'm trying to power a ballast that powers a neon light, and I need three different wattage ratings at 13.5 input volts. So I have the neon power supply ballast; here are its specs:
Input: 12 VDC @ 0.6 A
Output: 1.5 kV RMS @ 10 mA
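A quick sanity check on those specs: only the input side gives a meaningful power figure. For neon ballasts the kV and mA output numbers are typically separate ratings (open-circuit voltage and short-circuit current), so multiplying them does not give real delivered power - that interpretation is an assumption here, not something stated in the post:

```python
# Input power the ballast draws, straight from the specs in the post.
v_in, i_in = 12.0, 0.6   # 12 VDC @ 0.6 A
p_in = v_in * i_in       # P = V * I
print(f"ballast input power: {p_in:.1f} W")  # 7.2 W
```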
 
I'm not sure if that is the solution; I am really just asking for help to learn how all this works so I can find a solution. I figured if I can up the voltage it would increase the watts and therefore increase the neon bulb intensity. But I could be totally wrong?? Any help would be great.
 

You ARE totally wrong: increasing the voltage lowers the current, and because there are losses the total wattage is reduced.
 
If I know what my output watts need to be, how do I limit my watts/amps if my power source supplies too much? I hope this is easier to do than what I was asking previously.
 
A car battery can supply many amps, but your device will only draw what it needs. Boosting it outside its specifications may cause it to fail.

Is this for your car headlights as in another thread?
 
I have a few of the 12 volt neon tube ballasts around here, and as I recall the ones I have use a very simple oscillator and step-up transformer to get a high-voltage, high-frequency output. The actual output current is limited by an HV capacitor in series with the transformer output.
If yours are similar, you may be able to cheat a few more watts out of them by putting in a slightly larger value capacitor, or by adding a second smaller-value capacitor in parallel with the first. All arc tube lighting uses AC current, and for mass production this simple capacitive current regulation is very common.

It may work, but watch your ballasts because they could overheat rather quickly and burn up!
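The capacitive current-limiting idea above can be put in numbers: the series capacitor's reactance sets the arc current, so more capacitance means more current. The switching frequency and capacitor values below are illustrative assumptions, not taken from the thread:

```python
# RMS current through a purely capacitive series ballast.
import math

def ballast_current(v_rms, freq_hz, cap_farads):
    """Current limited by a series capacitor's reactance."""
    xc = 1.0 / (2.0 * math.pi * freq_hz * cap_farads)  # reactance, ohms
    return v_rms / xc

# 1.5 kV RMS at an assumed 20 kHz switching frequency:
i1 = ballast_current(1500.0, 20e3, 47e-12)           # 47 pF series cap
i2 = ballast_current(1500.0, 20e3, 47e-12 + 22e-12)  # +22 pF in parallel

print(f"47 pF: {i1 * 1000:.1f} mA")
print(f"69 pF: {i2 * 1000:.1f} mA  (more capacitance -> more current)")
```

Paralleling a small second capacitor bumps the current (and thus wattage) up, which is exactly why the overheating warning applies.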
 
