Automated Load Tester

The way I read the chart you attached is that if a 110 amp load were put on a 1120 AH single cell battery with an OCV of 2.10, the voltage would drop immediately to 2.05 volts. From there the voltage would continue to decline over an unspecified period of time until the battery is completely discharged at 1.7 volts. I can't see anything wrong with that. The one thing I do see is that the line is not linear. It's flat throughout the first 50% or so but then takes a pretty good dip. This may present a problem when attempting to say that the bottom half is equal to the top.

Amp hours is one way to quantify the capacity of a battery, and voltage definitely goes down as a battery discharges.

However, reserve capacity is yet another matter. They are two different measures of the same thing, or should I say two different ways of arriving at the same place, even though the routes taken are different.
Reserve capacity is a timed value at a given load. The load is always the same no matter what size or type of battery you are dealing with. The standard specifies that the load shall be either 25 or 75 amps. The value that represents the capacity of the battery is how long it takes for the battery to completely discharge at that load. That's it: no amp hours, no C values and, most importantly, no Peukert. Rating a battery in AH is only relative to other batteries rated in AH. It's easy to say a 150 AH battery has more capacity than a 100 AH battery, but if we should decide to test that capacity, what load would you use? C-5, C-10, C-20? These C values would be different for every battery. AH along with Mr. Peukert's law is very useful in determining how long a particular load can be sustained on a given battery, but it would greatly complicate making a tester.
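To make the Peukert point concrete, here is a minimal Python sketch of Peukert's equation, t = H * (C / (I * H))^k. The capacity, hour rating and exponent below are made-up example values, not figures from any post in this thread:

```python
def peukert_runtime_hours(capacity_ah, rated_hours, load_amps, k):
    """Estimated runtime t = H * (C / (I * H))**k (Peukert's equation)."""
    return rated_hours * (capacity_ah / (load_amps * rated_hours)) ** k

# A 100 AH (20 hour rate) battery with k = 1.2 under a 25 amp load:
print(peukert_runtime_hours(100, 20, 25, 1.2))  # ~2.9 h, not the naive 4 h
```

Every battery would need its own C values and exponent, which is exactly the complication the fixed-load reserve capacity test avoids.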

So you take a fully charged battery and put a 25 amp load on it. The battery voltage will drop, but it doesn't matter. We are only concerned with how long it takes for this battery under this load to fully discharge. In the case of a 6 volt battery that would be 5.25 volts. When the trigger voltage is hit, the test is over. The second the load disconnects the voltage will rebound, but it is of no concern. The test was over the second the battery hit 5.25 volts. The times will be different for different types and sizes of battery, but the load will remain the same.
The tester you have designed and I am hoping to build is golden for this value of 5.25 volts.

The question I am attempting to get an answer to is whether the 50% test is valid. 6.1 volts is the OCV that represents a 50% SOC. If the test cycle is stopped at this trigger voltage, does that truly mean the battery under a 25 amp load has reached 50%, or will that number have to be adjusted up somewhat? If it has to be adjusted, would that same adjustment be good for batteries with differing capacities? I am hoping to get some input and answers tomorrow.
 
The way I read the chart you attached is that if a 110 amp load were put on a 1120 AH single cell battery with an OCV of 2.10, the voltage would drop immediately to 2.05 volts. From there the voltage would continue to decline over an unspecified period of time until the battery is completely discharged at 1.7 volts. I can't see anything wrong with that. The one thing I do see is that the line is not linear. It's flat throughout the first 50% or so but then takes a pretty good dip. This may present a problem when attempting to say that the bottom half is equal to the top.

Two things about this:

1- The curves come together at 5.25 volts when the battery is fully discharged. I think this is because there is no acid left in the battery - only water. Hence the low specific gravity.

2- The initial drop is a function of battery size (amount of lead), hence a smaller battery will have a larger drop for the same 25 amp load.
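A toy numeric illustration of point 2, using Ohm's law with made-up internal resistance figures: the loaded terminal voltage is roughly OCV minus I times the internal resistance, so the smaller battery (higher internal resistance) shows the bigger initial drop for the same load.

```python
def loaded_voltage(ocv, load_amps, r_internal_ohms):
    # Loaded terminal voltage ~ OCV - I * R_internal
    return ocv - load_amps * r_internal_ohms

print(loaded_voltage(6.35, 25, 0.004))  # larger battery:  ~6.25 V
print(loaded_voltage(6.35, 25, 0.010))  # smaller battery: ~6.10 V
```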


However, reserve capacity is yet another matter. They are two different measures of the same thing, or should I say two different ways of arriving at the same place, even though the routes taken are different.
Reserve capacity is a timed value at a given load. The load is always the same no matter what size or type of battery you are dealing with. The standard specifies that the load shall be either 25 or 75 amps. The value that represents the capacity of the battery is how long it takes for the battery to completely discharge at that load. That's it: no amp hours, no C values and, most importantly, no Peukert. Rating a battery in AH is only relative to other batteries rated in AH. It's easy to say a 150 AH battery has more capacity than a 100 AH battery, but if we should decide to test that capacity, what load would you use? C-5, C-10, C-20? These C values would be different for every battery. AH along with Mr. Peukert's law is very useful in determining how long a particular load can be sustained on a given battery, but it would greatly complicate making a tester.

I agree. The reserve capacity is a way to take into account the C values (or Peukert exponent) of each battery, leaving only time as the variable.

So you take a fully charged battery and put a 25 amp load on it. The battery voltage will drop, but it doesn't matter. We are only concerned with how long it takes for this battery under this load to fully discharge. In the case of a 6 volt battery that would be 5.25 volts. When the trigger voltage is hit, the test is over. The second the load disconnects the voltage will rebound, but it is of no concern. The test was over the second the battery hit 5.25 volts. The times will be different for different types and sizes of battery, but the load will remain the same.
The tester you have designed and I am hoping to build is golden for this value of 5.25 volts.

I agree. However, the 5.25 volt value is the only one that is any good. That's because no matter what size the battery is, it is dead at 5.25 volts.

The question I am attempting to get an answer to is whether the 50% test is valid. 6.1 volts is the OCV that represents a 50% SOC. If the test cycle is stopped at this trigger voltage, does that truly mean the battery under a 25 amp load has reached 50%, or will that number have to be adjusted up somewhat? If it has to be adjusted, would that same adjustment be good for batteries with differing capacities? I am hoping to get some input and answers tomorrow.


I think this is the problem and why they are only spec'd at 5.25 volts.
See the attached graph. It represents 2 batteries: one a 250 amp hour at 25 amps and the other a 500 amp hour at 25 amps. Given different times they will both discharge to 50%, but at the 50% level they will have different voltages. The 250 amp hour battery will have a lower voltage at 50% than the 500 amp hour battery. At this level they both have the same specific gravity, but the 500 amp hour has more surface area and thus a lower internal resistance. At 5.25 volts neither can deliver any power because the chemical reaction ceases.

Whatcha think? :confused:
 
I think we may almost be there.
Soon to finish Battery 101.
The graph you posted is accurate but I "think" your interpretation is a bit off.
I need to use some generalized numbers for illustration, but I am aware there are some variations.
A lead acid battery has the following electrical and physical relationships:
100% SOC = 2.117 OCV = 1.265 Specific Gravity
50% SOC = 2.033 OCV = 1.190 Specific Gravity
Although these figures can be used to indicate SOC, they tell us nothing about capacity (AH).
A 500 AH battery will have the same values as a 100 AH battery at 100% or 50% SOC.
What accounts for the voltage difference shown in the graph is that these readings are taken under load. A higher capacity battery will have less voltage drop under load. Both curves track nearly identically, except the battery with the higher capacity will have less of a drop in voltage for any given load. Either of the two batteries, given they are of the same type and construction with the only difference being their capacity, would have the same OCV and specific gravity at 50% SOC.
Here's the problem as I see it for the 50% discharge test.
The open circuit voltage of 2.033 will be the same no matter what the battery's size.
But the tester is not monitoring OCV or a battery at rest.
The battery is under load the moment 2.033 volts is hit and the test ends.
The voltage will rebound, reflecting the fact that the battery has not yet reached 50% discharge. The cutoff voltage under load will definitely be affected by the capacity of the battery. The larger the battery, the larger the error, as illustrated by the graph you attached.
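For illustration only, a crude Python sketch that interpolates SOC linearly between the two per-cell points quoted above (2.117 V = 100%, 2.033 V = 50%). Real readings require a rested battery, which is exactly the problem being described:

```python
def soc_from_cell_ocv(cell_volts):
    v100, v50 = 2.117, 2.033    # per-cell OCV at 100% and 50% SOC
    return 50.0 + 50.0 * (cell_volts - v50) / (v100 - v50)

print(soc_from_cell_ocv(6.1 / 3))  # 6.1 V across 3 cells -> ~50% SOC
```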
 
ronv, did you make the graph?
If so, nicely done!
It shows both the problem and perhaps a simple solution.
Let me explain by using an illustration.
The tester is set for a 50% capacity test.
Press the button and start the load cycle.
The 25 amp load remains engaged until the trigger voltage of 6.1 volts is reached.
Tester shuts off.
Here's where things need to change.
With the addition of a simple digital volt meter it would be easy to see if the battery rebounds to a voltage higher than 6.1, indicating we have yet to reach a 50% discharge.
Let the battery rest so that the electrolyte has time to redistribute.
I will need to research the time period.
Then press the button again without resetting the timer.
Repeat as necessary until the volt meter reads 6.1 volts OCV with the battery at rest.
The accumulated time shown on the timer should be 50% of the battery's reserve capacity.
In reality this test might take as long as or longer than just taking the battery to the 100% mark, but it would avoid putting all the stress on the battery.
I suppose the process could be automated:
load - rest - compare to 6.1 volts - if greater than 6.1 volts, reload - continue until OCV equals 6.1.
That would avoid any errors induced by the operator that might make testing inconsistent.
A testing cycle could conceivably run for many hours and it would be easy to get sidetracked or distracted. But, and it's a big but, this would add yet another layer to the project. :eek:
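A minimal Python sketch of that automated load/rest cycle, assuming hypothetical read_voltage() and set_load() hardware hooks and a placeholder rest period (the proper rest time still needs research, as noted above):

```python
import time

TRIGGER_V = 6.1        # resting OCV that marks 50% SOC for a 6 volt battery
REST_SECONDS = 3600    # placeholder rest period, to be determined by testing

def fifty_percent_test(read_voltage, set_load):
    load_seconds = 0.0
    while True:
        set_load(True)                      # engage the 25 amp load
        start = time.monotonic()
        while read_voltage() > TRIGGER_V:   # discharge until the trigger is hit
            time.sleep(1)
        set_load(False)
        load_seconds += time.monotonic() - start
        time.sleep(REST_SECONDS)            # let the electrolyte redistribute
        if read_voltage() <= TRIGGER_V:     # rested OCV at 6.1 V or below: done
            return load_seconds / 60.0      # accumulated load time, in minutes
```

The returned minutes would be the accumulated load time to compare against 50% of the battery's reserve capacity.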
 
My experience with this method goes back to my golf cart. I built a "gas gauge" for it that measured open circuit voltage in 10% increments. It would bounce back 1 or 2 increments in a few seconds (2 or 3), and when the batteries were going dead it would bounce 2 to 4 increments over a few minutes. It would take a lot of data collection to establish the best time. While it could certainly be done, it begs for a microprocessor rather than hardware, especially if the rest time gets long.
Here is another proposal: I know you don't want to discharge 100% because of the damage to the battery. But how about we optimize the current for, say, the 75% point of your average battery - ~5.825 volts. (What is the size range, by the way?) Then set another point at 5.25 volts. Use the 75% point as an indicator of battery health, and if it looks suspect, change the scale and let it run to the 100% point. You would have to log the 75% points to compare year to year, but you probably do that in any case. I think you would soon develop a correlation between the 75% point and the 100% point for a given battery size.
 
But wait, there's more! :) While the curves are not exactly linear between 75% and 100%, they are pretty close until the end. The reserve capacity number (minutes) already takes into account the battery capacity, so I think you can interpret the curves given the reserve capacity. So if the reserve capacity number is say 220 minutes, I think you could use 187 minutes at 75%.
 
I feel as though we are reaching a convergence of thought here and will perhaps get to the workbench soon.
I originally chose the set points of 50, 80 and 100% because they match the set points in actual usage. Someone at some time determined that the most cost effective use of a deep cycle battery was to operate at or near its 50% capacity, that is, to use the top half of the battery's capacity. A life cycle curve depicting a battery's expected life takes a deep drop after 50% and a still worse, almost vertical drop after 80%. It may be acceptable to discharge a battery to 80% on occasion, but never beyond that point.

When you look at the life cycle curve it may come to mind: why not design a battery bank to operate in the top 20% and thereby extend its life even more? Well, as it turns out, you would extend the battery's life, but the charge efficiency of a battery is not linear. No battery can store all the charging current without some loss. In other words, 1 amp in does not equal 1 amp stored. Some batteries claim an efficiency in the mid 90% range, but as I said, even that is not linear. The closer to 100% SOC a battery gets, the more amps have to be dumped in to store 1 amp; in other words, the efficiency drops. So it is easy to fall into the trap of thinking you're doing yourself and the battery bank a favor by increasing its capacity, but in reality you will have a less efficient operation.
A person like myself attempts to size the battery bank properly and then put in place some safeguards. I set the system to produce an audible alarm at 50% and disconnect at 80%.

I think, please correct me if I'm wrong, that we have arrived at the conclusion that any capacity value short of 100% would be relative only to another test run on the same battery, and then its only value would be as a comparison to an earlier test. When I say only, that's not to imply that such a test would not be useful; it is just to say that it would not relate to the manufacturer's reserve capacity rating. As far as I can see, any set point short of 100% discharge would have the same relative value. The tester as designed meets this criterion.

The only time I can foresee needing an actual 100% capacity reading would be to calibrate a battery monitor or for warranty purposes. Trojan Battery states that a battery has reached end of life when it has lost 50% of its capacity. This is something that occurs over time. If I am setting up a new bank of batteries, calibration is simple: I use the specification sheet. But what happens when, 2 years later, I need to recalibrate the monitor? How much of the battery bank's capacity has been lost? Since I am operating in the top 50%, and the battery's life is at an end at 50%, I am really looking at the entire useful capacity of the bank. A 10 or 20% error makes the monitor an expensive but useless piece of hardware. Here the 50% discharge test takes on another light.

I could, with the tester as it stands, bite the bullet and do a complete 100% discharge test, or do a repetitive 50% discharge test and double the value.

I did some research:
Surface charge is the uneven mixture of sulfuric acid and water within the surface of the plates as a result of charging or discharging. It will make a weak battery appear good or a good battery appear bad. You need to eliminate the surface charge.
Allow the battery to sit for four to twelve hours to allow for the surface charge to dissipate.
I question, for the purposes of this test, whether or not it would be necessary to let it sit so long. One hour, or perhaps even a few minutes, would be enough to allow the OCV to rebound above the trigger voltage, and then another load cycle would resume. When the test stops cycling, then let the battery sit for 4 to 12 hours and recheck. Here the need for the constant load may be greater, as the cyclical nature of the test would be heating up and cooling down the load resistors a lot. To tell the truth, I like the idea of the constant load source. It doesn't seem to add that much to the cost and complexity of the project, but it does eliminate one of the many variables.

Got to get up and cook some dinner. Even though I thought I knew a lot about batteries, these conversations have shown me there's more to learn, but without your support I would not have been able to approach the project at all. Thanks :)
 
Forget what I said in post 86. The Peukert number only fixes the capacity problem, not the different voltage drop between 2 different size batteries at, say, 50 or 80%.

Yes, I think you are correct: the only true measurement is the 100% discharge number; everything else must have history. A lot depends on what you are trying to do with it. I am convinced that 100% is the only value the manufacturers will accept, so if we don't like that, the next thing is a 75 or 85% point that you can try to correlate to 100%.
I think the variation in the load will be insignificant when compared to the variations in the battery.
 
Not sure about Mr. Peukert and his law, but here's what I know from hands-on experience.
If you have two batteries with the same SOC but differing capacity,
when you place a load on them there will be a larger voltage drop shown on the battery with less capacity.

I have yet to have a battery fail under warranty. Maybe I'm lucky, or maybe it is not common; I can't say. When I have had batteries fail, it is usually not a capacity problem. That might be in part because I have never had an accurate method of measuring capacity, so they go on until there is a complete physical failure.

I know for a fact that batteries lose capacity as they age. If I have the ability to measure that loss I can do the following:

Make predictions as to where the battery bank is in its life cycle.

Correct balance problems in the bank.

Make the necessary adjustments to the solar system's charging current.

Calibrate the battery monitor.
Two of the systems have a Victron Energy BMV-600S monitor installed.
This is an unassuming meter from the outside, but inside it measures voltage and current.
It monitors the amperage in and out of the battery bank. Then, with the aid of a microprocessor and some logic, it takes into account charge efficiency and the Peukert effect.
The end result is a number of displays, some of which read very much like a gas gauge.
It works very well as long as it is initially programmed with the correct battery capacity.
That is easy to do with new batteries, but as they age it requires testing at least a couple of the batteries for their current capacity and then multiplying that by the number of batteries.

I realize that the closer we get to a complete discharge the less divergence there is, but once we deviate from the 100% mark I can't see that there is much difference between them. All are only relative to a previous test.

I think the tester would function as it stands on the drawing board. I would like to complete the thought process for a constant load if possible, and I can experiment with a volt meter.

What do you think?
 
I did some simulations with LTspice IV using a PWM drive (through an RC filter to control load currents) from an MCU which does ADC sampling across the load resistors to determine the current flow. The MCU handles the closed loop control to maintain constant current no matter what voltage the battery is at. It also tracks the precise time to discharge and can calculate the Peukert index to profile the battery. The MCU then logs the data and makes it available via RS232 over USB to any PC app for further analysis or for printing a battery condition label to stick on the battery.

My design looked at 4 A and 8 A CC loads based around FET-switched precision load resistors. Altering the load is trivial. The 2 precise discharge currents are necessary to derive the Peukert index.

The MCU also recharges the battery, and pulse charges it if desulphation is required to restore battery health.
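For readers following along, a rough Python-style sketch of the closed-loop constant-current idea (this is not Mosaic's actual firmware; read_current_amps() and set_pwm_duty() stand in for hypothetical hardware routines, and the gain is arbitrary):

```python
SETPOINT_A = 4.0   # one of the two CC loads mentioned (4 A or 8 A)
KI = 0.01          # integral gain; would be tuned on real hardware

def regulate_step(duty, read_current_amps, set_pwm_duty):
    # Nudge the PWM duty so the measured current tracks the setpoint,
    # regardless of where the battery voltage sits.
    error = SETPOINT_A - read_current_amps()
    duty = min(1.0, max(0.0, duty + KI * error))   # clamp duty to 0..1
    set_pwm_duty(duty)
    return duty
```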
 
Mosaic
Welcome to this rather lengthy thread discussing and attempting to build a load tester.
I am a tinkerer, a builder and an individual with a need to build a reliable test instrument.
I think I may have seen your name somewhere in a previous post or perhaps in another thread.
ronv and another Ron have pretty much mothered this idea along so far, and they will be better able to understand what you are proposing in this last post.
I can understand some of what you are saying but need help with the rest.
Here's what I read:
You used a simulation program to design a circuit for a pulse width modulated load in conjunction with an RC filter and a microprocessor. My only encounter with ADC sampling is with audio circuits, and I am having trouble seeing how it fits in.
There are enough words in there that it sounds intriguing, but I need some help in understanding what exactly you are proposing and how it works.
 
Yep, I think we are at the crossroads.
I can integrate the constant current load into the existing simulation and see how it looks. I also want to think about Mosaic's PWM idea. Up to this point I have tried to stick to something you can test with a voltmeter, but I don't like the big heatsink needed for the transistor in the constant current design. I'm also not crazy about so many small resistors just to keep the power down in the transistor. Do you mind the heatsink? I've also stayed away from microprocessors: one, because I am not very good with them, and two, you have to get all the "stuff" to program them. But let me take a look at the PWM idea. This would set a current at say 27 amps for the lowest voltage, 32 for the highest, then only turn the current on for a portion of a cycle to get 25 amps average. This is done at a fairly high frequency. I'm not sure of the battery's response to this, but I could probably test it on a small battery here. If I remember right from the time I built the switched capacitor balancer, the current rise time was very fast into the battery.

This method has the advantage of being on or off, so very little power is dissipated in the switch. I'm not going to have much time for the next few days, so don't think I gave up on the project.
 
Battery recovery

I go with microcontroller-based systems for the convenience of being able to embed all the controls required, all the MMI (man machine interface) required, and the ability to upgrade or change function with a software revision.

The ADC in the uC is used to sample voltages; these can be converted into currents once the resistance of the load is precisely known. Thus the uC will know all the voltages and currents coming out of the battery at any time. The uC can be supplied with a precision voltage reference if you need sub-1% accuracy.

So the uC drives the load transistors to produce the required load current for a prescribed time, or until the battery voltage drops to a preset value. The uC can then calculate and display the Ah rating of the battery.
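As a sketch of that voltage-to-current step (the ADC resolution, reference and resistance below are assumed example values, not figures from the post):

```python
ADC_BITS = 10
VREF = 5.0      # volts, assumed precision voltage reference
R_LOAD = 0.25   # ohms, assumed precisely known load resistance

def adc_to_amps(counts):
    volts = counts * VREF / (2 ** ADC_BITS - 1)  # ADC counts -> volts
    return volts / R_LOAD                        # I = V / R (Ohm's law)

print(adc_to_amps(512))  # mid-scale reading -> ~10 A with these values
```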

The system I was designing aimed to do the following (unattended):

1) Apply a lead acid charge regimen based on the battery type, e.g. flooded, AGM, gel cell, etc.
2) Apply a CC discharge to a preset voltage to determine battery capacity.
3) Recharge the battery
4) Track battery data in onboard memory
5) Switch to another attached battery and restart from item 1.

At any time I can dump all the battery AH ratings, matched to battery serial numbers, to a PC or laptop.

Battery recovery mode (unattended):
1) Apply pulse charging with 36 VDC high-current pulses to desulfate the battery.
2) Discharge the battery to a fixed SOC (voltage based). Derive the Peukert index of the battery.
3) Evaluate whether the battery's AH capacity is improved by the pulse charging.
4) Restart from item 1 if item 3 is +ve, else recharge the battery and move on to another attached battery.
 
When I look at the PWM method I come up with, say, an 81% duty cycle with 31 amps on and 0 amps off for the fully charged battery. I'm not sure what effect this has on the average current in the battery. And of course it puts little bumps on the battery voltage that really should be filtered out. It also adds a sense resistor and amplifier. So I'm coming out on the side of the linear current source as actually being simpler and more accurate. But I still keep coming back to: why do we need the super accuracy if the measurement is a relative one? I can see repeatability as important, but not really 24.9 amps versus 25.1 amps.
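A quick check of that duty-cycle arithmetic (the on-currents are the rough 27 to 32 amp figures from the earlier PWM discussion):

```python
def duty_cycle(avg_amps, on_amps):
    # Fraction of each PWM period the load must be on for a given average
    return avg_amps / on_amps

print(duty_cycle(25, 31))  # ~0.81, the 81% figure for a fully charged battery
print(duty_cycle(25, 27))  # ~0.93 near the low-voltage end
```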
 
Now I am experiencing the frustration that must come with any attempt to develop something new and then finding the solution was right in front of me all along.
Originally, way back in post 1, I proposed the idea and asked you all for support in making a capacity tester for lead acid batteries.
I opted to use voltage as the marker for SOC, knowing full well some of the inherent difficulties associated with this, but it seemed a better alternative than trying to track amperage, as that would involve many calculations.
There are at least three battery monitors on the market currently.
**broken link removed**
https://www.bogartengineering.com/products/TriMetric
https://www.xantrex.com/power-products/power-accessories/linkpro-battery-monitor.aspx
All of them are very similar in function, and I wouldn't be surprised if they all use the same chip.
I had misinterpreted their functionality because, as I understood it at the time, for the monitor to be accurate it was necessary to know and then manually enter the capacity of the battery. This is relatively easy for a new installation but becomes increasingly difficult as the battery ages and loses capacity. None of the monitors have any provisions for directly measuring the initial capacity. So I figured it would be necessary to come up with this value, and that was the genesis of this thread. Last night I think I got a different perspective from the other end of the funnel.

Please review this thought process and by all means help me find any problems with it.

In general, all three monitors function like this.
These battery monitors calculate the battery's SOC based upon the following:
A battery is judged to be fully charged when the charging voltage is greater than or equal to a preset value (Vc) and the charging current has dropped below another preset value (It) for a preset period of time (Tcd).
At this point the battery is said to be fully charged and the monitor resets itself to 100%.
Then the monitor begins to follow any charging and discharging that occurs:
amp hours in or amp hours out.
Using a microprocessor and its internal algorithms, the monitor compensates for the Peukert effect (PC, Peukert exponent preset) and charge efficiency (CEF preset).
Only one, the Xantrex, compensates for temperature.
The displayed values indicating SOC are calculations based upon the 100% SOC mark and the battery capacity (Cb preset).
These values are continually tracked until the battery meets the criteria stated above for a fully charged battery, whereupon the monitor is reset to 100%.
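As an illustration only, a bare-bones Python sketch of that bookkeeping. The preset names follow the post (Cb, PC, CEF); the Peukert weighting shown (effective Ah = I^PC x t) is one common approximation, not any vendor's actual algorithm:

```python
class AhCountingMonitor:
    def __init__(self, cb_ah, pc=1.2, cef=0.94):
        self.cb = cb_ah      # Cb: entered battery capacity in Ah
        self.pc = pc         # PC: Peukert exponent preset
        self.cef = cef       # CEF: charge efficiency factor preset
        self.ah_out = 0.0    # effective amp hours removed

    def update(self, amps, hours):
        """Positive amps = discharge, negative amps = charge."""
        if amps >= 0:
            self.ah_out += (amps ** self.pc) * hours          # Peukert-weighted
        else:
            self.ah_out = max(0.0, self.ah_out + amps * hours * self.cef)

    def soc_percent(self):
        return 100.0 * (1.0 - self.ah_out / self.cb)

    def reset_to_full(self):
        """Called when the Vc / It / Tcd full-charge criteria are met."""
        self.ah_out = 0.0

m = AhCountingMonitor(cb_ah=186)
m.update(25, 2)          # 2 hours at a 25 amp discharge
print(m.soc_percent())   # ~49%, below the naive 73% due to Peukert weighting
```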

All three of the monitors have a function where you can set a floor for percent of discharge, at which point a warning is issued, and there is a NO/NC connection that can be used for an external device.
So with one of these monitors and a data sheet similar to the one attached, I think my problem can be resolved.
Set the monitor up and enter all pertinent data (presets). Enter the battery or battery bank's capacity as it is listed on the specification sheet. Set the monitor to a 50% discharge floor, place a load, any load, on the battery and allow it to reach the 50% mark.
At this point a simple relay could disconnect the load using the contacts provided in the monitor.
With the load cycle over, the values provided by the monitor are the total amps discharged and the total Ah supposedly remaining. Let the battery rest and check OCV (a function provided within the meter), or if possible check the actual specific gravity to see the actual SOC. Using the attached example for a 186 Ah battery: it would have had 93 Ah taken out of it, have 93 Ah remaining and indicate a 50% discharge level on the monitor. Compare that to the actual current SOC as measured by OCV and/or specific gravity: is it equal to or lower than 50%? How much lower?
Say the OCV = 6.02 with a specific gravity of 1.162, or an actual 30% SOC.
That's 20% lower, or 37 Ah of capacity lost. Therefore the battery now only has a capacity of 149 Ah.
For any future use the monitor needs to be reset to reflect the current battery capacity of 149 Ah.
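Spelling out that arithmetic, with the values straight from the example above:

```python
entered_capacity = 186                    # Ah entered from the spec sheet
ah_removed = 0.50 * entered_capacity      # 93 Ah out when the monitor trips
actual_soc = 0.30                         # from rested OCV / specific gravity
lost_ah = (0.50 - actual_soc) * entered_capacity   # 20% of 186 ~ 37 Ah
print(entered_capacity - lost_ah)         # ~149 Ah, the new capacity preset
```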

I think we are golden here, but I may be chasing a chicken through the barnyard.
The only thing the monitor is measuring is discharge current, buffered or filtered by the Peukert exponent. The Peukert exponent is derived from the actual construction of the battery and does not change with the capacity of that same battery.

What do you all think?:confused:
 
I think these guys are there to give you an estimate of remaining capacity so you don't discharge the battery too far. I think all it really does is:
You enter the capacity.
It keeps track of current in minus current out, and time.
When the amp hours are below say 50% (of the value you entered) it can shut off a load or sound an alarm.

What I don't think it does:

It doesn't know the actual capacity of an older battery.
 
You're right, and all the brain damage I did attempting to make it do so was for nothing. :(

I thought I was out of the woods but not so.

Today I had the good fortune to have an extended conversation with one of the engineers at US Battery. I learned a great deal, and it was really nice to have an expert to bounce some ideas off.
He said that my idea works mathematically, but in practice it does not work. In short, it has to do with the fact that a lead acid battery is not an exacting thing, and measuring the SOC likewise has an unpredictable amount of error in it. Even when measuring SOC by specific gravity, the error factor is unpredictable and too high. That was a real surprise to me. Everything about a lead acid battery is logarithmic and nothing is linear. There is only one point at which all the lines converge, and that is the point of complete discharge, or 1.75 volts per cell. All battery capacity ratings by all the different manufacturers are based on this single point of reference, because it's the only place they can be.
He went on to say that even testing the same battery with different loads will produce different capacity values. The most accurate load would be at or near the C-20 value for any given battery. In reality and in the field this is not practical: the load is too small, the discharge time would be too long, and it would require a different load value for almost every battery. So testing for reserve capacity, where the load is fixed and the only variable is time, is the way to go. Whoopee! At least I got one thing right.

Design changes
Functionally, the tester you have designed, ronv, will work with these changes.
Forget about the 50 and 80% points; they won't work.
The tester instead needs two LVD points, 5.25 and 10.5 volts. May as well make it work for 12 volt batteries.
Since OCV is such an unreliable means of measuring SOC, the green go/no go light can be done away with as well.
However, the constant 25 amp load, the more constant the better, is still necessary. I know we have discussed this at length, but is it possible to make a constant load with the input voltage varying from 5 to 13 volts? :confused:

Sorry so much of this information has come about so late, but after today's conversation I can't see any other cliffs.
 
Seems like you can't go wrong testing to the battery manufacturer's specs. In all the stuff I have read, the 5.25 value is the only one that seems bulletproof. We can make it for 6 and 12 volts; keep in mind the resistors will have to be twice as large: 6 V x 25 A = 150 watts, while 12 V x 25 A = 300 watts. How should we decide which type, 6 or 12 volt? It might be safest to have the tester do it based on OCV: more complex but safest. It doesn't add much to have the fully charged LED, so let's keep it just so you know you are starting with a "full" battery. It's going to be fairly large. Is that okay?
 
Not only bulletproof, but it's the only place all batteries meet. My dad used to say we're all equal when they nail the box shut.

It would be nice if it were possible to make the tester work for both 6 and 12 volts, but if it complicates things too much, then for right now I would opt for 6 volts.
It's just that it would make the test instrument more versatile the other way.

When I was being enlightened by the engineer from US Battery, we spoke about making a load tester. When I mentioned a simple resistive load, he flatly stated it won't work. He said it had to be a constant 25 amp load that remained constant even with the voltage fluctuation.
Are there any alternatives to a straight-up resistive load? What was Mosaic talking about?

The test standard is how much time it will take for a battery to completely discharge (1.75 volts per cell) under a 25 amp load. When the trigger voltage is hit, the battery is under load.

If you decide the dual voltage tester is not out of reach, then the go/no go indicator would also have to respond to two voltages, or there would have to be two lights.

I don't have any size constraints. Of course the smaller the better, but even if it were as big as the attached picture, that would be fine. It's a load tester I already have for testing automotive batteries. Different procedure completely: lots of amps for only seconds. But a case similar to this might work nicely.
 
I think we are on the same page. Once we go for the 5.25 number it has to be constant current, because the curve is nonlinear for that part, so the average is no good. The 6 or 12 volt option will be more complex, but like my daddy used to say, it only costs a little more to go first class. :rolleyes:
As to size: I think it will take 11 of the 4" long resistors, a heatsink about 3x5x2, and a printed circuit board about 2x the last one. Are you sure you don't want to put in a 50 or 75% point? I'm still convinced this measurement on the same battery year to year will give a figure of merit: not a go/no go, but a comparison to the last test on the same battery.
Do you have a clamp-on current probe to set the constant current?
If we don't have the big heatsink and resistors outside the box in free air, we would probably need a fan to move the hot air out of the box. A screened-in section around the hot stuff might work best. Whatcha think? :confused:
 