
Solar Power Supply Question

Status
Not open for further replies.

pmolsen

New Member
I have a circuit that I want to be able to power from either solar or from a regulated power supply. The circuit runs an LED-based light that draws 1A.

For solar I am using a solar panel with an M083 solar controller. The controller will charge the battery up to 13.8V.

I want to limit the light's input to 12V so as not to damage the LEDs, but I want it to keep working even if the battery voltage drops below 12V.

I did some experimenting with a 7812 and a 9V battery. Without the 7812 I get a reading of 8.1V across the battery under load and the light is still quite bright. When I insert the 7812 I only see about 6.8V across the output under load and the light is very dim.

I don't think I can use a zener due to the current drawn.

So how do I limit the voltage to 12V without losing so much when the voltage goes below 12V?
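A rough sanity check of those 7812 numbers, assuming the usual ~2 V dropout figure from a classic 78xx datasheet (the exact dropout depends on the part and the load current):

```python
# Rough model of a 7812 linear regulator in and near dropout.
# Assumption: a classic 7812 needs roughly 2 V of input headroom at 1 A,
# so below about 14 V in it can no longer hold 12 V out.

def regulator_out(v_in, v_nominal=12.0, dropout=2.0):
    """Output of an idealized linear regulator with a fixed dropout."""
    return min(v_nominal, v_in - dropout)

# Fully charged battery: even here the 7812 can't quite reach 12 V.
v_charged = regulator_out(13.8)   # ~11.8 V
# Sagging 9 V test battery: output collapses, matching the dim light.
v_sagged = regulator_out(8.1)     # ~6.1 V
print(v_charged, v_sagged)
```

So with this simple model the 7812 is marginal even at 13.8V in, and hopeless once the input sags, which is consistent with what was measured.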
 

In your case you want 12V at 1A while the battery voltage varies from below 12V up to 13.8V. One solution may be to use a DC-DC converter that meets that requirement.
 
You need a buck-boost switching regulator. Go to this link and put in your numbers.
 
Had a look at the buck-boost converters. Afraid I am after something a tiny bit simpler than that. The LTC3780, for example, requires the 24-pin buck-boost controller IC plus over 20 discrete external components according to the sample diagram.
 
OK, try a SEPIC switch-mode power supply.
 
If you are using a 12V 12W LED light, it has its own internal buck regulator and will allow for an 11V to 13V supply.
 
I am using a light that has 180 LEDs, in series strings of 3 with a 295 ohm resistor on each string. (Yep, that's the actual R value). Total measured current is right on 1 amp. It has no electronics on it at all other than that.
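Working out the numbers implied by that description (180 LEDs in 60 strings of 3, 295 ohm per string, 1 A total; assuming the 1 A was measured at roughly a 12 V supply):

```python
# Back-of-envelope numbers for the light as described in the thread:
# 180 LEDs, strings of 3 in series, 295 ohm per string, 1 A total.
# Assumption: the 1 A figure was measured at about 12 V supply.

n_leds = 180
leds_per_string = 3
n_strings = n_leds // leds_per_string          # 60 parallel strings
i_total = 1.0                                  # A, measured
i_string = i_total / n_strings                 # ~16.7 mA per string

r_string = 295.0                               # ohms per string
v_supply = 12.0                                # assumed during measurement
v_f = (v_supply - i_string * r_string) / leds_per_string  # ~2.36 V per LED

# Same string on a fully charged battery:
i_at_138 = (13.8 - leds_per_string * v_f) / r_string      # ~22.8 mA

print(f"{i_string*1000:.1f} mA/string, Vf ~ {v_f:.2f} V, "
      f"{i_at_138*1000:.1f} mA at 13.8 V")
```

So at 13.8V each string would run roughly a third over its nominal current, which quantifies the "stressing them a bit" concern that comes up later in the thread. (Vf actually rises slightly with current, so the real overshoot is a little less.)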
 
Presumably it's designed to run from a car battery, the nominal voltage of which is 13.8V? I don't see that you have a problem.

If you're really bothered, then stick a couple of 3A rectifiers in series between the battery and the light.
 
If you want simple and don't mind a bit of wasted power, just use a 2 ohm 5 watt power resistor or a couple of rectifier diodes in series. If your LEDs are drawing 12 watts, I think the 2 watts a 2 ohm resistor would burn is relatively reasonable; a switch-mode supply at 80% efficiency is only going to be the slightest smidgen more efficient.
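The efficiency comparison behind that suggestion, worked through for the 1 A / ~12 W load in this thread (the 80% SMPS figure is the poster's assumption for a typical small converter):

```python
# Dropper-resistor vs. switch-mode supply, at the 1 A load described.

i_load = 1.0           # A
r_drop = 2.0           # ohm series dropper
v_batt = 13.8          # V, fully charged battery

v_drop = i_load * r_drop          # 2 V lost across the resistor
p_wasted = i_load**2 * r_drop     # 2 W dissipated (hence a 5 W part)
p_in = v_batt * i_load            # 13.8 W drawn from the battery
eff_resistor = (p_in - p_wasted) / p_in   # ~85.5% delivered to the light

eff_smps = 0.80        # assumed typical small switch-mode converter
print(f"resistor: {eff_resistor:.1%}, SMPS: {eff_smps:.0%}")
```

At full battery voltage the crude resistor actually comes out slightly ahead of an 80%-efficient converter, which is the point being made.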
 

The OP says that the input varies from <12V to 13.8V. If he uses 2 diodes to drop the voltage, then at low input the current through the LEDs (which have a 295 Ω series resistor for each string of 3) may not be sufficient to give the required light output. Using additional series resistors will give the same problem. Right?
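Putting numbers on that objection, assuming ~0.7 V per rectifier diode and the ~2.36 V per-LED forward drop backed out from the 1 A measurement earlier in the thread:

```python
# String current with two series dropper diodes as the input sags.
# Assumptions: ~2.36 V forward drop per LED (from the 1 A @ ~12 V
# figure earlier in the thread), ~0.7 V per series rectifier.

V_F_LED = 2.36      # V per LED (assumed)
V_DIODE = 0.7       # V per series rectifier (assumed)
R_STRING = 295.0    # ohms per string of 3 LEDs

def string_current(v_in, n_diodes=2):
    v_across_r = v_in - n_diodes * V_DIODE - 3 * V_F_LED
    return max(v_across_r, 0.0) / R_STRING

i_nominal = string_current(12.0, n_diodes=0)   # ~16.7 mA, as measured
i_low = string_current(11.5, n_diodes=2)       # ~10.2 mA with 2 diodes
print(f"{i_low / i_nominal:.0%} of nominal string current")
```

With these assumed drops the strings fall to roughly 60% of their nominal current at 11.5V in, so the dimming concern looks justified.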
 
If it is all about efficiency, why not simply add an LED or two to the string when the input voltage is high, and then switch out the extra LEDs as the voltage sags? Sure, you lose a few lumens, but the remaining LEDs still get their rated current. You could automate this using some voltage comparators.
 
Re-arrange the LED array to use a constant-current source instead of a resistor; it will work at any voltage within the compliance of the constant-current source. It will be especially effective at lower voltages, where a fixed resistor is just sapping power.
 

Can you please suggest a practical circuit which will supply a constant current to the LEDs with an input from 11.5V to 13.8V?
 
You might need a rail-to-rail op-amp to regulate close to the +VCC rail, but the maximum voltage is limited only by the op-amp's and MOSFET's voltage ratings.
 
It depends on the gate voltage the MOSFET needs to turn fully on at the required current. If it's used as a low-side driver, and the op-amp and the load share the same VCC, it would never be a problem; I was just being paranoid. If the voltage the FET is switching is much higher than the op-amp's supply, then at high currents the MOSFET may require more gate voltage than the op-amp can deliver, but that's rare for a low-side driver.
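A rough sizing sketch for the low-side arrangement being discussed (op-amp drives an N-MOSFET, sense resistor in the source leg, op-amp servos the sense voltage to a reference, so I = Vref / Rsense). The reference voltage and Vf values below are illustrative assumptions, not from the thread:

```python
# Sizing the op-amp + N-MOSFET low-side constant-current source.
# The servo loop forces V(Rsense) = Vref, so I_load = Vref / Rsense.
# Vref = 0.5 V and Vf = 2.36 V/LED are assumed, illustrative values.

i_target = 1.0        # A, whole-array current
v_ref = 0.5           # V reference (kept low to save voltage headroom)

r_sense = v_ref / i_target            # 0.5 ohm sense resistor
p_sense = i_target**2 * r_sense       # 0.5 W dissipated in it

# Headroom check at the minimum input voltage:
v_in_min = 11.5
v_leds = 3 * 2.36                     # ~7.1 V for a string of 3 LEDs
headroom = v_in_min - v_leds - v_ref  # left for the MOSFET and wiring
print(f"Rsense={r_sense} ohm, {p_sense} W, headroom {headroom:.1f} V")
```

With these assumptions there are nearly 4 volts of compliance left at the worst-case input, so a low-side current sink should regulate comfortably across the whole 11.5V to 13.8V range.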
 
Hmmm. Methinks me just gonna leave it like it is and hope for the best. The last thing I want is lots of extra electronics. Another brand of light that I was using had that: it ran the LEDs at 8V and had a converter circuit that would accept anywhere from 9V to 24V. Problem is, within a week I had 2 of them fail completely. That shouldn't happen when driving the LEDs directly in parallel strings, since there are no common components that can fail. I might lose a few strings due to the 13.8V stressing them a bit too much. Will just suck it and see.
 
Pick resistors for 13.8 volts and suck up the dimming that's going to occur down to the lower voltage limit. Then nothing will fry ever and you'll just get a slight dimming at the lower voltage.
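Working out what that last option means in numbers, reusing the ~2.36 V per-LED forward drop backed out earlier (an assumption, since Vf varies with current):

```python
# "Design for 13.8 V" option: resize the string resistor so each string
# sits at its rated ~16.7 mA on a full battery, then accept the dimming
# as the voltage falls. Vf per LED assumed ~2.36 V from earlier figures.

V_F = 2.36             # V per LED (assumed)
I_RATED = 1.0 / 60     # ~16.7 mA per string (1 A over 60 strings)

r_new = (13.8 - 3 * V_F) / I_RATED      # ~403 ohm instead of 295

def brightness(v_in):
    """String current relative to rated, with the resized resistor."""
    return max(v_in - 3 * V_F, 0.0) / r_new / I_RATED

print(f"R ~ {r_new:.0f} ohm; at 11.5 V: {brightness(11.5):.0%} of rated")
```

So swapping the 295 ohm resistors for roughly 400 ohm parts protects the LEDs at full charge, at the cost of the strings dropping to about two-thirds of rated current at 11.5V in; whether that dimming is acceptable is the trade-off being offered.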
 