How Long To Charge 24v System...Is My Math Right?


rs14smith

Member
Hi all, I'm just trying to figure out if my math is right for charging two batteries hooked up in series to produce 24v from a 20 watt solar panel:

Given Info

Solar panel (link removed):
20 Watts
24v
.61A

Batteries (link removed):
24v
35Ah

---------
Calculations:

P = V x I, so energy = V x Ah

For the batteries:
Energy = 24v x 35Ah = 840Wh

Since we already know how many watts the solar panel produces, we can simply divide the watt hours of the batteries by the solar panel wattage (20W):

Time to charge = 840Wh / 20W = 42 hours

--------------------------------
So in an "ideal world" would my calculations be correct, as I know there are other small losses to consider, but using the simple math above, would you agree around 42 hours is what it would take to charge the batteries back up to about 100%?
 

Right, as I stated, I know there are some other small factors to consider, but ideally, are my calculations correct if we didn't have efficiency to worry about? If what I have above is correct, THEN I will factor in the efficiency numbers. I'm just trying to start simple and then work up to the more complex stuff, or in other words, take baby steps :)
 
If you actually get 20W continually off your panels, then your calculations are 'near enough'.

However, you're unlikely to ever actually get 20W, and the maximum you do get will be for only short periods per day - I would 'guess' that you would be doing well to fully charge the batteries in under a week.
 
Yep, that I'm aware of too, but yes, my goal is to be able to recharge the batteries within a week. If they cannot be fully charged within a week, I need to use a larger panel. Also, the batteries are likely not to be drained 100% every time they are in use, so I wanted to first just see if my basic math above was correct before moving forward.
 
Given what you have stated, and then factoring in realistic charging losses plus the fact that most solar panels will not put out their peak power the whole time the sun is out, I would not be surprised to see a charging time close to 90 - 100 hours on average.
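Just to illustrate that ballpark (the derating fraction below is an assumed example, not a measured figure):

ideal_hours = 840.0 / 20.0        # 42 h at the full 20 W rating
effective_fraction = 0.45         # assumed average fraction of rated output actually delivered
real_hours = ideal_hours / effective_fraction
print(f"Roughly {real_hours:.0f} hours of actual sun")   # ~93 h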
 
There are much higher-order losses to consider, based on:
1. Latitude and longitude
2. Which way the panels face
3. Season
4. Is MPPT employed?

Those determine the irradiance, along with errors due to clouds and rain.

Then we have:

Efficiency of the panels
Efficiency of the charger
Possibly efficiency of the inverter
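One rough way to combine those factors, just as a Python sketch (every number below is an assumed example, not a figure from this thread):

irradiance_factor = 0.6   # latitude, orientation, season (assumed)
weather_factor    = 0.8   # clouds and rain (assumed)
panel_derating    = 0.9   # panel temperature, dust, aging (assumed)
charger_eff       = 0.85  # charge controller / battery charge efficiency (assumed)

overall = irradiance_factor * weather_factor * panel_derating * charger_eff
print(f"Overall factor ~{overall:.2f}, so ~{42 / overall:.0f} hours of daylight")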
 
Your batteries should NEVER be discharged 100%. That is complete destruction. 50% is PUSHING it, unless it is a thick-plate, deep-cycle battery. Even then, 50% will reduce the life.
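Putting numbers on that with the 35Ah pack from the original post (the 50% figure is the limit suggested above):

capacity_ah = 35.0
usable_ah = capacity_ah * 0.5      # 17.5 Ah if you stop at 50% depth of discharge
usable_wh = usable_ah * 24.0       # ~420 Wh actually available per cycle
print(usable_ah, usable_wh)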
 
You will need a lot more than 24v to charge the two batteries. Set the whole thing up on a very hot day. You will be lucky to get 100mA.
 
For a start, Colin, a 24 volt PV panel has an open-circuit voltage over 30 volts.

Now, for a 35Ah battery array on, say, a C/10 charge, that would mean in a perfect world that to get around 3.5 amps you would need two 12 volt 60 watt panels in series. As your location would be under snow for nearly half the year, to safely charge that battery array you should consider a 5:1 margin to allow for real-world conditions, if PV is the only option for charging the battery. The best bet would be to use your 20 watt panel and have a battery charger connected for when the battery goes below 24 volts.
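The C/10 arithmetic above works out roughly like this (nominal figures only; real panel output will be lower):

capacity_ah = 35.0
charge_current = capacity_ah / 10.0    # 3.5 A at a C/10 rate
charge_power = charge_current * 24.0   # ~84 W at the nominal 24 V
# hence two 12 volt 60 watt panels in series (120 W nominal) in a perfect world,
# plus the 5:1 real-world margin suggested above for a snowy location
print(charge_current, charge_power)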

For my shed array, which is a 24 volt system, the battery bank is two 550Ah arrays, and I have 500 watts of PV and a homemade motor-conversion wind generator. I don't have any other backup charger as, due to my work, I only use the shed on weekends, and the batteries are usually fully charged and dumping any input power.

I do know this is more info than you asked for on the original question, but it is a real-world example.

Regards Bryan
 
Two 12v batteries will rise to about 28v during charging. A 30v panel minus two diode drops = 29v. You have 1v headroom!!!
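Spelled out in the same terms (assuming roughly 0.5v per diode, which is what the 29v figure above implies; ordinary silicon diodes would eat more of the headroom):

panel_voltage = 30.0      # open-circuit figure quoted above
diode_drop = 0.5          # per diode, assumed Schottky-level drop
charging_voltage = 28.0   # two 12v batteries near full charge

headroom = panel_voltage - 2 * diode_drop - charging_voltage
print(f"Headroom: {headroom:.1f} v")   # about 1v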
 