Light Source Power Supply

Status
Not open for further replies.

HyperB

New Member
I am developing a prototype light source for scientific experimentation. I want to use 'an assortment' of 6 to 20 quartz halogen bulbs, 10W (possibly 20W) each. The light illumination level needs to be very stable - and consequently I've been advised that a current regulated DC PS is preferred. I would prefer to use a 'programmable' PS so that I can vary the voltage or current, and therefore the light intensity.

A few questions:
1) If I assume the maximum load to be 20 bulbs @ 10W = 200W, at typically 12V DC, then what should be the maximum load specification of the PS ? By my way of thinking, 200W / 12V means 16.6 Amps, correct? What 'extra' power buffer should I add?

2) Assuming for sake of argument I have a 200W supply, what happens if I decide to use only 8 lamps, or 15 lamps? If I use a programmable supply, can't I limit the output so I don't 'fry' :eek: the lamps? What if it's not programmable?

3) Should the set of lamps be wired in series or parallel? Why?

4) Any suggestions for vendors for such a PS ??

Thanks in advance
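The arithmetic in question 1 can be checked with a short script. This is only a sketch; the 30% headroom factor is a common rule of thumb, not a manufacturer's specification.

```python
# Sketch of the load arithmetic in question 1. The 30% headroom
# factor is a rule of thumb, not a manufacturer's specification.
BULB_POWER_W = 10.0
N_BULBS = 20
SUPPLY_V = 12.0
HEADROOM = 1.3  # ~30% margin over the maximum anticipated load

total_load_w = BULB_POWER_W * N_BULBS      # 200 W worst case
load_current_a = total_load_w / SUPPLY_V   # about 16.7 A at 12 V
rated_supply_w = total_load_w * HEADROOM   # 260 W suggested rating

print(f"load: {total_load_w:.0f} W, current: {load_current_a:.1f} A")
print(f"suggested supply rating: {rated_supply_w:.0f} W")
```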
 
If you are planning to drive them with constant current, they MUST be connected in series. If so, the power supply will force that current through them and the voltage applied to the string of lights will be whatever it takes to do that.
 
Can someone explain WHY there is a difference in a 100W 'load' if it comes from serial vs parallel connections to 10 x 10W bulbs?

And again, if the maximum load will be 200W, what is a 'safe' margin? What should the rating on the PS for peak load be?
 
Can someone explain WHY there is a difference in a 100W 'load' if it comes from serial vs parallel connections to 10 x 10W bulbs?
The load power is the same, it changes the voltage and current.

And again, if the maximum load will be 200W, what is a 'safe' margin? What should the rating on the PS for peak load be?
I would say at least 30% more than maximum anticipated load.
 
Can someone explain the following:

I am using a 150W / 12V power supply (MeanWell S-150-12) to provide power to 10W / 12V quartz halogen bulbs:

With 4 or 6 bulbs wired in SERIES, the bulbs barely glow a soft red...

If the same bulbs are wired in PARALLEL, they emit expected 'bright white' light. In fact, with this supply, I can illuminate up to 12 bulbs (10W/12V) but only in PARALLEL.

Back to the same ol' question: previous posts suggested I should use only series wiring, but it doesn't work. Can someone please explain what's happening here?

Thanks...
 
"With 4 or 6 bulbs wired in SERIES, the bulbs barely glow a soft red..."

With the bulbs in series you are dividing the supply voltage; with 6 bulbs that's only 2 volts each.
Wired in parallel they all get the 12 volts they need.
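Gerty's point, in numbers: with identical bulbs a series string divides the supply voltage equally, while in parallel every bulb sees the full 12 V. A minimal sketch (it ignores the fact that a cold halogen filament has far lower resistance than a hot one):

```python
# Per-bulb voltage for N identical bulbs on a 12 V supply.
SUPPLY_V = 12.0

def volts_per_bulb(n_bulbs: int, wiring: str) -> float:
    """Series splits the supply across the string;
    parallel gives every bulb the full supply voltage."""
    if wiring == "series":
        return SUPPLY_V / n_bulbs
    return SUPPLY_V

print(volts_per_bulb(6, "series"))    # 2.0 -> barely glows red
print(volts_per_bulb(6, "parallel"))  # 12.0 -> full brightness
```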
 
Thanks Gerty... I think I'm slowly seeing the light! :)

What I'm after is a rock solid light intensity, from a set of bulbs. What's the practical difference between using a 150W supply at 48V (for 4 bulbs in series) vs. a 12V supply for 4 bulbs in parallel?

With the parallel configuration, I can add/remove bulbs, and tell when one is burned out! But previous posts suggest 'serial is better'... but I still have no idea WHY.
 
Can someone explain the following:
...
Back to the same ol' question: previous posts suggested I should use only series wiring, but it doesn't work. Can someone please explain what's happening here?

Your questions imply that you lack an understanding of basic Ohm's law and how it relates to series and parallel connections of equal value resistors. Please read the "resistance" sections of the following:

Ohm's law - Wikipedia, the free encyclopedia

Series and parallel circuits - Wikipedia, the free encyclopedia

Incandescent light bulb - Wikipedia, the free encyclopedia
 
Thanks for the links MikeMl...

Yes you are absolutely correct: I DO NOT understand Ohm's law very well - that's why I came to this forum to ask questions. It's been 30 yrs since I studied basic circuits in college physics - and with regard to electronics, I'm not afraid to state I'm rather dumb!

But I'm a good experimentalist, and I know that the system I am investigating is NOT a simple problem. I don't just have '4 lamps powered by a battery'. In my case, the line voltage (mains) to the power supply fluctuates; the resistance of each lamp changes with temperature and with age; transmission through the quartz glass envelope changes with age; resistance in the lamp connections changes due to long-term corrosion; etc. Life is not simple!

I understand spectroscopy, but not electronics. What I'm asking for, is a little simple advice on 'what makes a good power supply' for my application. When someone says to wire the lamps in series, I want to understand WHY this is suggested - what are the subtle physical / chemical changes to devices that would cause someone to recommend one circuit over another (serial vs. parallel). Sure I can read, read, read... but it's a lot more sensible to broadcast questions to experts!

If someone has some really PRACTICAL knowledge to contribute, I would appreciate it. Thanks a bunch...

HyperB
 
"What I'm after is a rock solid light intensity, from a set of bulbs. What's the practical difference between using a 150W supply at 48V (for 4 bulbs in series) vs. a 12V supply for 4 bulbs in parallel?"

The difference between the two supplies you mention is that at 12 volts you will need a higher-amperage supply than you would at 48 volts.
Current is calculated by dividing watts by volts: 200 watts (possible total) divided by 12 volts is about 16.7 amps; 200 watts divided by 48 volts is about 4.2 amps. Thus it would be cheaper to buy a 48 volt supply (provided you had neither).
From a control point of view, with a 48 volt supply you would pretty much have to burn the bulbs in banks of 4 (disregarding a regulator circuit), and that may not be convenient for you.
With a 12 volt supply you'll need more current, but you could turn on the bulbs individually to fine-tune the required amount of light.
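Gerty's comparison as a quick calculation (the 48 V option assumes strings of four 12 V bulbs in series, and the 200 W figure is the worst case from earlier in the thread):

```python
# Supply current for the same 200 W worst-case load at two voltages.
LOAD_W = 200.0

for supply_v in (12.0, 48.0):
    current_a = LOAD_W / supply_v
    print(f"{supply_v:>4.0f} V supply -> {current_a:4.1f} A")
```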
 
Thanks Gerty... I think I'm slowly seeing the light! :)

What I'm after is a rock solid light intensity, from a set of bulbs.
Your talk about aging and changes in the lenses sounds like you are trying to get the bulbs to put out a fixed amount of light that does not change. To do that, you would need to run each light from an independent 12V source controlled by a feedback loop, directed by a light sensor that measures the light's output. Basically, you would need a power supply for each lamp.
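The feedback idea above can be sketched as a simple proportional control loop. Everything here is hypothetical: `read_light_sensor()` and `set_lamp_voltage()` stand in for whatever photodiode and programmable-supply interfaces the real rig would use, and the gain and limits are only illustrative.

```python
# Minimal proportional feedback loop for constant light output.
# read_light_sensor() and set_lamp_voltage() are placeholders for
# real hardware interfaces; gain and voltage limits are illustrative.

def regulate(read_light_sensor, set_lamp_voltage,
             target_lux: float, steps: int = 100,
             gain: float = 0.01, v: float = 12.0) -> float:
    """Nudge the lamp voltage until the measured light level
    settles at target_lux. Returns the final voltage."""
    for _ in range(steps):
        error = target_lux - read_light_sensor(v)
        v += gain * error
        v = min(max(v, 0.0), 14.0)  # stay within safe lamp limits
        set_lamp_voltage(v)
    return v

# Toy model: light output proportional to voltage (real halogen
# output is strongly nonlinear in voltage).
final_v = regulate(lambda v: 100.0 * v, lambda v: None,
                   target_lux=1000.0)
print(f"settled at {final_v:.2f} V")
```

With the toy linear model the loop settles at the voltage whose modeled output equals the target; a real implementation would replace the lambdas with hardware drivers and tune the gain for the lamp's thermal lag.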
 
I agree with Gerty. Your best bet is to run the four lamps in series so that they all get the same current. To do that, it will require a supply with approximately four times the voltage it would take to run one lamp, and four times the wattage. (All directly from Ohm's Law)
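Written out for the 10 W / 12 V lamps discussed earlier in the thread:

```python
# Sizing a supply for N identical lamps wired in a series string:
# the same current flows through every lamp, so the supply needs
# N times the voltage and N times the wattage of a single lamp.
N_LAMPS = 4
LAMP_V, LAMP_W = 12.0, 10.0

lamp_current_a = LAMP_W / LAMP_V   # ~0.83 A through the whole string
supply_v = N_LAMPS * LAMP_V        # 48 V across the string
supply_w = N_LAMPS * LAMP_W        # 40 W minimum, before any headroom

print(lamp_current_a, supply_v, supply_w)
```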
 