
High voltage voltage divider circuit to run an LED?

Status: Not open for further replies.

HR19

Member
OK, so I'm familiar with the basic idea of a voltage divider circuit. I have a system running at 62V max, 54V nominal. I wanted to have a little alert LED, so I was thinking about using ten 1kohm resistors and adding the LED between the 9th and 10th (or 1st and 2nd, if it matters), to get at most 6V and just a few mA. Is there a better/easier way? Is this a really crappy way? Or is this a perfectly fine way?
 
You seem to want to make a resistive divider, which is not needed. The LED just needs to have its current limited, that's about it. So, if you use a 10k series resistor you'll have about 6mA of current going through the LED. You'll have to make sure you don't overheat that series resistor, though: about 60V across a 10k resistor will dissipate about 0.36W in it. Check the type of resistors you have for their power rating, derate by at least half, and use multiple resistors to share the burden if needed.
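A quick sanity check of those numbers in Python. This is only a sketch: the 2V forward voltage is an assumed typical red-LED figure, not something stated in the thread.

# Series-resistor sizing; Vf = 2V is an assumed typical red-LED value.
V_SUPPLY = 62.0    # worst-case supply voltage (V)
V_FORWARD = 2.0    # assumed LED forward voltage (V)
R_SERIES = 10e3    # proposed series resistance (ohms)

i_led = (V_SUPPLY - V_FORWARD) / R_SERIES            # Ohm's law: 6.0 mA
p_resistor = (V_SUPPLY - V_FORWARD) ** 2 / R_SERIES  # 0.36 W in the resistor

print(f"LED current: {i_led * 1e3:.1f} mA")
print(f"Resistor dissipation: {p_resistor:.2f} W")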
 
I have a bunch of 1k, 1/4W resistors, so ten of those will work fine. I guess that 6mA at 60V is 360mW, which might be a bit high. I could always use more resistors and reduce the current until I get a suitable brightness.
 
It's easy enough to calculate what values you need and the power dissipation of the resistor(s). I took the lazy way out and used an LED calculator program, but you might Google to understand the math.


[Attached screenshots: Electrodoc Pro LED calculator results]
 
Hmm, so I wouldn't need to divide the voltage? If I just use big enough resistors, it'll still work? If I use 10k of total resistance, I'll get about 6mA, and at 60V that would be 360mW across the LED, wouldn't it? Or would it be 360mW split across all the resistors and the LED?
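The split can be checked directly, since the same current flows through every series element, so each part dissipates its own voltage drop times that current. A minimal Python sketch, assuming a 2V LED forward voltage:

# Power split between the LED and the series resistance (Vf = 2V assumed).
V_SUPPLY, V_FORWARD, R_TOTAL = 60.0, 2.0, 10e3
i = (V_SUPPLY - V_FORWARD) / R_TOTAL       # ~5.8 mA through everything
p_led = V_FORWARD * i                      # ~12 mW in the LED
p_resistors = (V_SUPPLY - V_FORWARD) * i   # ~336 mW shared by the resistors
print(f"LED: {p_led * 1e3:.0f} mW, resistors: {p_resistors * 1e3:.0f} mW")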
 
This picture shows the usual way to use LEDs.

The series resistor drops voltage to set the current through the LED; the LED current is the important parameter. The forward voltage of the LED must be used to calculate the series resistor: red LEDs have a forward voltage of about 2.2V, white LEDs around 3.7V. It depends on the color and chemistry of the LED.

The voltage across the resistor is the supply voltage minus the Vf of the LED. Use Ohm's Law to calculate the resistance for the desired LED current; 5mA is usually plenty for modern LEDs. Then calculate the power dissipated by the resistor. Its rating must be greater than this calculated value.

For this case, the power dissipated by a 10k resistor is about 360mW. You can use the same math (do it and show your work) for a string of 1k resistors.

You could have learned this by watching any number of YouTube videos...

[Attached image: led calc.jpg — LED with series resistor]
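That post's math reduces to a couple of lines of Python. A sketch, using the typical forward voltages quoted above; all values are illustrative:

# Solve for the series resistor and its dissipation.
def led_series_resistor(v_supply, v_forward, i_led):
    """Return (series resistance in ohms, resistor power in watts)."""
    r = (v_supply - v_forward) / i_led   # R = (Vs - Vf) / I
    p = i_led ** 2 * r                   # P = I^2 * R
    return r, p

# Red LED (Vf ~ 2.2V) on the 60V supply at the suggested 5mA:
r, p = led_series_resistor(60.0, 2.2, 5e-3)
print(f"R = {r / 1e3:.1f} kohm, P = {p * 1e3:.0f} mW")  # ~11.6 kohm, ~289 mW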
 
I find that modern LEDs used for indication are bright enough with just 1mA, in which case a much higher resistance can be used.
With a 47k resistor the current will be ~1mA and the power dissipation less than 100mW.
Might be worth experimenting.
Note: if you need this visible across a lit room, then a higher current may be advisable.

Mike.
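Checking that 47k figure the same way, again assuming roughly 2V forward voltage:

# Verify the 47k suggestion: about 1.2 mA, well under 100 mW in the resistor.
V_SUPPLY, V_FORWARD, R = 60.0, 2.0, 47e3
i = (V_SUPPLY - V_FORWARD) / R   # ~1.23 mA
p = i ** 2 * R                   # ~72 mW
print(f"I = {i * 1e3:.2f} mA, P = {p * 1e3:.0f} mW")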
 
There is a possible advantage to having a resistor in parallel with the LED. With a 47 kOhm series resistor alone, the LED will start to light at around 2 V and will get steadily brighter from 2 V up to 60 V.

If the 60 V is turned on and off by a switch, and the switch has some leakage current, the LED will glow; a 1 MOhm leakage resistance will probably be enough to make it glow.

Similarly, if the voltage is, say, 10 V rather than 60 V, the LED will still be visible.

If you add a 10 kOhm resistor in parallel with the LED, that will stop the LED lighting on leakage currents or at low voltages, but it will have very little effect with a 60 V supply. With 10 kOhm in parallel and a 47 kOhm series resistance, the LED will start to light at about 11 V, or when there is a leakage resistance of about 300 kOhm.
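Those thresholds follow from treating the circuit as a plain 47k/10k divider while the LED is below its forward voltage. A sketch, assuming Vf around 2V, with all values illustrative:

# Below Vf the LED draws essentially nothing, so the node voltage is set
# by the 47k/10k divider alone.
R_SERIES, R_PARALLEL, V_FORWARD = 47e3, 10e3, 2.0

# Supply voltage at which the LED node first reaches Vf:
v_on = V_FORWARD * (R_SERIES + R_PARALLEL) / R_PARALLEL
print(f"LED starts to light above ~{v_on:.1f} V")  # ~11.4 V

# Largest leakage resistance (in place of the open switch) that still
# lifts the node to Vf from a 60 V supply:
V_SUPPLY = 60.0
r_leak = V_SUPPLY * R_PARALLEL / V_FORWARD - (R_SERIES + R_PARALLEL)
print(f"Leakage below ~{r_leak / 1e3:.0f} kohm lights the LED")  # ~243 kohm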
 