Battery power in the wild – suggestions wanted

VegasF6

Flashlight Enthusiast
Joined
Dec 5, 2007
Messages
1,449
Location
Las Vegas
While that does make sense, let me point out a few things.
Every time you convert that energy to another form, you are losing some of it to heat.
Charging efficiency for nickel batteries is around 66%. Charging efficiency for lithium-ion cells is nearly perfect, 97-99%.
Lithium batteries can be charged in parallel, and with simple voltage regulation won't be overcharged.
 

HarryN

Flashlight Enthusiast
Joined
Jan 22, 2004
Messages
3,977
Location
Pleasanton (Bay Area), CA, USA
Hi, I didn't find exactly what I was looking for, but here are some examples I have seen.

REI is a big camping outfit in the US, and their site is rei.com. I just searched there for solar chargers, and found this link:

http://www.rei.com/search?query=solar+chargers&button.x=64&button.y=21

Maybe something on that page is interesting.

Here is another, closer to what I remember, from Global Solar. Serious budget required.

http://www.batterystuff.com/solar-chargers/P362watt.html

and more

http://www.batterystuff.com/solar-chargers/foldable/
 

guyburns

Newly Enlightened
Joined
Apr 13, 2011
Messages
27
Charging efficiency for nickel batteries is around 66%. Charging efficiency for lithium-ion cells is nearly perfect, 97-99%.
Lithium batteries can be charged in parallel, and with simple voltage regulation won't be overcharged.

Efficiency of charging was something I considered in passing, but haven't yet looked into in detail. I will now, in a serious way. If the efficiency figures you quoted are correct, I may have to make some changes.

Having said that, the 66% efficiency you quote for NiMH looks like it refers to the "capacity efficiency" based on a charge rate of 0.1C for 15 hours – the charge rate typically listed on AA batteries. That amount of charge equates to 1.5C, therefore the efficiency is taken to be 1/1.5 = 66%. And that is correct, if a cell is charged at 0.1C for 15 hours. However, anyone with a charger that measures capacity can easily determine that you don't need to put in 1.5C to get back 1.0C.

My own tests based on termination at a 5 °C temperature rise above ambient (see Test 6 on page 7 of http://www.mediafire.com/?2cnw1u9z2axnxe0) indicate that a NiMH will be fully charged when about 100 mAh more than what was taken out is put back in. And I suspect most of that overhead is due to the thermal delay inherent in the battery; i.e. it takes time for the temperature rise that indicates full charge to heat the entire battery and become detectable. For a 2000 mAh Eneloop, 100 mAh is only 5%, meaning an Eneloop whose charge is terminated at the optimal time will have a "capacity efficiency" of at least 95%. I'm keen to see technical data that proves otherwise. The difficulty is ensuring that termination takes place when it should, which is never a problem for Li-ions, hence their near-perfect "capacity efficiency".

But what I have called "capacity efficiency" is not the real efficiency, which should be calculated as Eout/Ein (energy out over energy in). That would be quite difficult to measure because you'd have to calculate how much energy is going in (integrate Vin x Iin over time for the entire charge period) and do the same for discharge. I have no idea what the energy efficiency would be. A guess would be that at a charge rate of 0.1C, the average charge voltage might be 1.4 volts, yet on discharge the average voltage would be 1.2 volts. Assuming "capacity efficiency" to be 100%, energy efficiency = 1.2/1.4 = 86%.
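(In symbols, in my own notation rather than anything from a datasheet: energy efficiency = Eout/Ein = [integral of Vdis x Idis dt] / [integral of Vchg x Ichg dt], each integral taken over the full discharge or charge period.)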

Li-ions must suffer from the same effect, because their discharge voltage is lower than their charge voltage. I suspect they would perform even worse, because NiMHs give out most of their energy on a gently-sloping 1.2-volt plateau, whereas Li-ions have a steeper discharge characteristic.

Maybe someone can direct me to technical data regarding energy efficiencies of NiMH and Li-ion.
 

guyburns

Newly Enlightened
Joined
Apr 13, 2011
Messages
27
REI is a big camping outfit in the US, and their site is rei.com. I just searched there for solar chargers, and found this link… Maybe something on that page is interesting.

Something on that page was interesting. They listed Goal Zero and Brunton solar panels. The Goal Zero stuff has already been covered in previous posts, and the Brunton – well, their Solaris 6 looks remarkably like my Sunlinq, which has just arrived. I'm going round in circles. REI sells Brunton, who buy it from Global Solar, who rebrand it Solaris in place of Sunlinq?

I always like to use the original manufacturer's products if possible. I assume Global Solar (who make the Sunlinq brand) and Powerfilm Inc manufacture their own products. Are there any other well-known manufacturers who make consumer panels?
 

guyburns

Newly Enlightened
Joined
Apr 13, 2011
Messages
27
I have chosen two similar-capacity batteries, an Eneloop (2000 mAh) and a Panasonic Li-ion cell (1950 mAh), to work out the energy efficiency because I couldn't find anywhere on the net where this has been done. The original data sheets and the extracted curves (with Photoshop paths included to assist in calculating the area via Histograms) can be downloaded as a zip file from here: http://www.mediafire.com/?p1b57hsjzkd9722 in case someone wants to check my figures.

Note: I didn't perform any tests. All the figures below are from the data sheets in the zip file.

Total Charge Energy
For the Eneloop, charging was done at constant current and a graph of voltage vs time was drawn. To calculate the total energy input (V x I x t = Joules) you have to work out the area under the graph (after suitable scaling). Conditions and total energy input:

T = 25°C
Charged at 2000 mA
Energy in = 10.2 kJ

For the Li-ion, the charge conditions were:
T = 20°C
Constant current of 1295 mA, followed by constant voltage of 4.20 volts until the current fell below 100 mA.
Energy in = 28.2 kJ

Discharge Energy
Both batteries were discharged at constant current (a variety of settings) until termination at 1.0 volts for the Eneloop and 2.75 volts for the Li-ion. Voltage was measured and plotted against capacity (not against time). Again, the total energy can be calculated by working out the area under the graph (after suitable scaling).

One interesting thing about these discharge graphs which I didn't realise: because the horizontal axis is capacity, it can be shown that each unit area of the graph represents the same amount of energy. So a graph that has more area under it is giving out more energy. The graphs show that as the discharge current is increased, the total amount of energy being delivered by the cell decreases (even though the total mAh output by the various currents might be quite similar).
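To make that concrete, here is a minimal sketch of the calculation (the data points below are made up for illustration; real numbers would come from the datasheet curves). Since 1 mAh = 3.6 coulombs, the area under a voltage-vs-capacity curve in V·mAh, multiplied by 3.6, gives joules:

```python
def discharge_energy_joules(capacity_mah, volts):
    """Trapezoidal integration of voltage over capacity (E = integral of V dQ)."""
    area_v_mah = sum((volts[k] + volts[k + 1]) / 2 * (capacity_mah[k + 1] - capacity_mah[k])
                     for k in range(len(volts) - 1))
    return area_v_mah * 3.6  # 1 mAh = 3.6 C, so V x mAh x 3.6 = joules

q = [0, 500, 1000, 1500, 2000]      # mAh delivered (hypothetical points)
v = [1.35, 1.25, 1.22, 1.18, 1.00]  # cell voltage at each point (hypothetical)
print(f"{discharge_energy_joules(q, v) / 1000:.1f} kJ")  # about 8.7 kJ
```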

Efficiencies below (in brackets) are calculated as output/input.

For the Eneloop (25°C):
• 400 mA gave 8.8 kJ (86%)
• 2000 mA gave 8.0 kJ (78%)

For the Li-ion (20°C):
• 370 mA gave 26.3 kJ (93%)
• 1850 mA gave 24.8 kJ (88%)

The particular Li-ion cell investigated here is about 10% more efficient than the Eneloop in its conversion of charging energy to final-use energy. The explanation is not that you need to put more mAh into an Eneloop than a Li-ion to get the same output (both cell types are virtually 100% efficient – I think – in their conversion of mAh from input to output); the explanation is that NiMH cells have a higher differential between their charging voltage and their delivery voltage. Energy depends on V, I and t, so the higher the differential, the greater the energy loss.

Advantage Li-Ion?
When you want to ensure the most efficient use of energy (solar charging for example), it looks like Li-ions have a 10% advantage. However, given that 8 Eneloops in series have a voltage that is close to optimum for direct connection to a solar panel, and that Li-ions would require a charger (and hence incur losses), the 10% advantage may not exist in my situation.

66% Coulometric Efficiency?
Wikipedia, and hence a lot of other websites, have got the wrong end of the stick when they claim that the charge/discharge efficiency of NiMH is 66%. Some websites dress up the error by throwing in the fancy word "coulometric". The mAh efficiency is certainly 66% if you charge at the 10 hour rate for 15 hours, but by doing that you are overcharging the battery.

Unless someone points out errors in the above, I'm going to have to begin editing some Wikipedia articles.
 

VegasF6

Flashlight Enthusiast
Joined
Dec 5, 2007
Messages
1,449
Location
Las Vegas
Personally, I read it here:
http://www.powerstream.com/NiMH.htm
And it claims that the efficiency worsens at faster charge rates, the opposite of what you are saying. If I charge them at a 1C rate, it's going to take ~1.5 hours (well, that's the same, not worse).

I am somewhat thrown by your conversions to joules; wouldn't it be better to measure watts instead? Not sure how you could extract these numbers from a graph, since they are on a slope.
Luckily for me I can simply let my chargers do the work and tell me how much energy it took to charge each cell. Then, allow them to settle for an hour before testing discharge. This much I can tell you just from the back of my hand though: the NiMH cells get warm during charging, the Li-Co cells don't. That heat is being generated somewhere; I guess it comes from the charger having to overcome the resistance of the cells to charge them. Heat = watts being consumed.
 

SilverFox

Flashaholic
Joined
Jan 19, 2003
Messages
12,449
Location
Bellingham WA
Hello Guyburns,

Let's focus on the charge efficiency of NiMH chemistry.

From the Eneloop data sheet, the maximum capacity was obtained by charging at 200 mA for 16 hours. This happens to be the industry standard for charging NiMH chemistry.

This gives our nominal 2000 mAh cell a charge input of 3200 mAh, which means 1.6 times as much charge going in as is available to use.

The data sheet also lists the results of charging at 2000 mA. In this case the charge continues for 1.1 hours, giving 1.1 times as much charge going in as is available to use.

Based upon this information, you may speculate that charge efficiency is related to charge rate.

At both charging rates, you end up putting more in than you get out. When charging at slow rates, you end up putting a lot more in.

About the best charge efficiency for NiMH chemistry occurs when charging at 2C. At that rate you only need to put in about 1.05 times as much charge as is available to use.
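As a quick arithmetic check of those two data-sheet cases, here is a small sketch using the numbers quoted above (the helper is mine, purely for illustration):

```python
# Charge factor = charge put in / rated capacity, for the two data-sheet cases.
rated_mah = 2000
for current_ma, hours in [(200, 16), (2000, 1.1)]:
    input_mah = current_ma * hours
    print(f"{current_ma} mA x {hours} h = {input_mah:.0f} mAh "
          f"-> {input_mah / rated_mah:.2f}x rated capacity")
# 200 mA x 16 h -> 1.60x; 2000 mA x 1.1 h -> 1.10x
```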

You can verify all of this for yourself by running some tests.

Li-Ion chemistry charging efficiency is quite high, and it only starts to drop off as the cell becomes damaged. Due to the algorithm used to charge Li-Ion cells, the charge rate has little influence upon the charge efficiency.

Tom
 

guyburns

Newly Enlightened
Joined
Apr 13, 2011
Messages
27
I want to make myself clear about what I am doing. I have no barrow to push. If certain of my ideas about batteries/chargers are incorrect, and you can point me to authoritative sources that prove I am incorrect, that's great. I'm here to learn about batteries. I would ask the same in return. It has been suggested I run some tests. Over the last five years I have run several hundred tests (see http://www.mediafire.com/?2cnw1u9z2axnxe0). If my test results are wrong, I want to know about it.


Powerstream is probably not an authoritative source. The best source of information would come from the people who make batteries. Duracell has this to say regarding NiMH:
If a timed charge termination is used, a time of 120 percent charge input is recommended with a backup temperature cutoff of 60°C.
They also say in the same document:
Higher capacity levels are achieved with a 150 percent charge input, but at the expense of cycle life; long cycle life is attained with a 105 to 110 percent charge input, albeit with slightly lower capacity due to less charge input…

It seems to me that the standard input charge of 1.5C is used to make sure that the cell is fully charged when tested for capacity. Manufacturers want to be able to state the highest possible rating for their cells, so during testing, even if it is highly inefficient and damaging to the cell, they force in 1.5C to make sure the cell is at maximum capacity. And fair enough. That's the way to get maximum out of a cell. But if you put in 1.05C, according to Duracell itself, this will result in "slightly lower capacity". They don't mention the actual figures, but my testing indicates that there is very little loss, around 5%:

• 1.50 C (input) => 1.0 C (output)
• 1.05 C (input) => 0.95 C (output)

The manufacturer will of course choose the first. It gives his battery a higher rating.

I am somewhat thrown by your conversions to joules; wouldn't it be better to measure watts instead?

A joule is the base unit of energy. The watt is a derived unit (joules/sec). A watt cannot be used to work out the total energy transferred. There is no useful information conveyed by asking: "How many watts can that battery provide?" There is no real answer – 0.01 watts for operating an iPod; 10 watts when operating a Cree LED. But ask: "How much energy does that battery store?" That can be answered.

To use an analogy: the energy stored in food is always expressed in Calories (1 food Calorie = 4180 J), never in Calories/second. A chocolate bar might have 200 Calories (836,000 J); it could also be said to provide 836 watts (for 1000 seconds), or 1000 watts (for 836 seconds), or 1 watt (for 9.68 days). See what I mean? Energy is the thing.

Not sure how you could extract these numbers from a graph since they are on a slope.
Energy = Volts x Amps x Seconds. You split the graph into 1 second intervals, find the average of the voltage and current for each interval, then calculate the energy: V x I x 1. Add up all the energy for the period of discharge and you have the total energy transferred. The easier way is to work out the area under the graph. How did I do that? I used Photoshop to calculate the area for me, and then double-checked by visually averaging the graphs.
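For anyone who wants to do the same numerically rather than in Photoshop, here is a minimal sketch, assuming you have logged voltage and current at fixed intervals (the sample values are made up):

```python
def energy_joules(volts, amps, dt_s=1.0):
    """Sum V x I x dt over all samples: total energy in joules."""
    return sum(v * i * dt_s for v, i in zip(volts, amps))

# Hypothetical per-second samples from a constant-current charge at 2 A.
volts = [1.40, 1.42, 1.45]
amps = [2.00, 2.00, 2.00]
e = energy_joules(volts, amps)
print(f"{e:.2f} J = {e / 3.6:.2f} mWh")  # 1 mWh = 3.6 J
```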

Luckily for me I can simply let my chargers do the work and tell me how much energy it took to charge each cell. Then, allow them to settle for an hour before testing discharge.
Please test if you have the time. Using 200 mA, charge an Eneloop for 10 hours, then test its capacity; do the same for 11, 12 and 15 hours. I'd be very interested in the results.

This gives our nominal 2000 mAh cell a charge input of 3200 mAh, which means 1.6 times as much charge going in as is available to use.
Yes, I agree. But that does not mean you have to put in 3200 mAh to get back 2000 mAh. The cell is being purposely overcharged to make sure it is holding maximum capacity. That's a sensible thing to do while testing. It is not a sensible thing to do in normal use.

Based upon this information, you may speculate that charge efficiency is related to charge rate.
If someone does speculate that, I think they are wrong. Put an Eneloop into your charger, discharge it, recharge it at 200 mA for various times as described above, and test its capacity.

When charging at slow rates, you end up putting a lot more in. About the best charge efficiency for NiMH chemistry occurs when charging at 2C. At that rate you only need to put in about 1.05 times as much charge as is available to use. You can verify all of this for yourself by running some tests.
Can you provide a reference for 2C being the most efficient charge rate? I have not seen that figure mentioned in spec sheets. For discharge, 2C is definitely the least efficient; the spec sheets clearly show that. I don't know if the same applies for charging.

I would be very interested in the results from anyone prepared to test as described above. My results may be wrong. I have performed extensive testing (see Test 6, p7 of my PDF). After 37 tests using charge rates from 200mA to 1000mA, there was no significant difference in the capacity stored by a variety of batteries. It seems to me that whether the charge current into a NiMH is 100mA, 500 mA or 2000 mA, if you put in 2100 mAh, out will come ~2000 mAh.

That statement of mine may be at variance with what people believe about charging NiMHs. I have a lot of tests which back up that statement. As I have already said, though: if my results are wrong, I want to know about it. I'm not here to stick up for my wrong ideas. I'm here to learn! Truth over false beliefs any day.
 

Mr Happy

Flashlight Enthusiast
Joined
Nov 21, 2007
Messages
5,390
Location
Southern California
Hi guyburns,

On reading through your above posts, I believe your methodology is sound and your conclusions are reasonable. I stop short of saying your conclusions are accurate merely because I have not verified your calculations. However, it seems likely you have done the calculations correctly and your numbers are right.

Personally I would suspect that lower charging rates are more energetically efficient than higher rates. This is because some of the charging power is lost due to internal resistance in the cell, and power losses are proportional to current squared. So if you double the current and halve the charging time, you have doubled the resistance losses. (Four times the power losses over half the time equals twice as much energy lost overall.) The resistance losses are the primary reason that cells deliver less energy when discharged at higher rates. More energy is converted to heat in the cell and less is delivered as useful energy in the external circuit.

Your other observation with NiMH cells about full charging vs partial charging also makes sense. Once an NiMH cell is nearly full, it becomes much harder to store extra charge in it. Therefore more of the charging current is wasted as heat and less is accumulated inside the cell. This is why the cell starts to warm up before it reaches a state of full charge. If you terminate the charging of an NiMH cell as soon as a significant rate of increase in temperature is detected you will probably get maximum charge efficiency at the cost of some lower capacity.

I have an Eneloop charging graph somewhere that shows the temperature change during charging very clearly.
 

Mr Happy

Flashlight Enthusiast
Joined
Nov 21, 2007
Messages
5,390
Location
Southern California
I did a quick calculation of my own using the Eneloop HR-3UTG datasheet.

For charging at 2000 mA at 25°C I calculated the energy supplied was 2980 mWh (= 10.7 kJ).
For discharging at 400 mA at 25°C I calculated the energy provided was 2480 mWh (= 8.9 kJ).

Dividing these numbers gives an efficiency of 83%.

We seem to have a difference in our calculation of the energy-in value. I did my calculations by zooming the plot to a large scale, reading points off the graph with a cursor, and plotting the data in Excel. I used Excel to calculate the area under the graph using the trapezoidal rule.
 

VegasF6

Flashlight Enthusiast
Joined
Dec 5, 2007
Messages
1,449
Location
Las Vegas
The resistance losses are the primary reason that cells deliver less energy when discharged at higher rates. More energy is converted to heat in the cell and less is delivered as useful energy in the external circuit.

Your other observation with NiMH cells about full charging vs partial charging also makes sense. Once an NiMH cell is nearly full, it becomes much harder to store extra charge in it. Therefore more of the charging current is wasted as heat and less is accumulated inside the cell. This is why the cell starts to warm up before it reaches a state of full charge. If you terminate the charging of an NiMH cell as soon as a significant rate of increase in temperature is detected you will probably get maximum charge efficiency at the cost of some lower capacity.
You have shown that even at a fixed level of resistance, power losses would be greater at higher charge rates. But, even more so, due to exothermic chemical reactions inside the NiMH cell, resistance increases while charging, increasing the losses exponentially.

As to partial vs full charging of the cell, the same is true for lithium cells. The first stage of the charge, the current-limiting stage, is by far the easiest, and the time it takes to complete is almost directly proportional to input current. Very nearly 1 to 1. However, the constant-voltage stage is of course the hard part.
But is it fair to compare charge efficiencies between a fully charged Li-Co cell and a mostly charged NiMH?
 

VegasF6

Flashlight Enthusiast
Joined
Dec 5, 2007
Messages
1,449
Location
Las Vegas
A joule is the base unit of energy. The watt is a derived unit (joules/sec). A watt cannot be used to work out the total energy transferred. There is no useful information conveyed by asking: "How many watts can that battery provide?" There is no real answer – 0.01 watts for operating an iPod; 10 watts when operating a Cree LED. But ask: "How much energy does that battery store?" That can be answered.

Energy = Volts x Amps x Seconds. You split the graph into 1 second intervals, find the average of the voltage and current for each interval, then calculate the energy: V x I x 1. Add up all the energy for the period of discharge and you have the total energy transferred. The easier way is to work out the area under the graph. How did I do that? I used Photoshop to calculate the area for me, and then double-checked by visually averaging the graphs.

Please test if you have the time. Using 200 mA, charge an Eneloop for 10 hours, then test its capacity; do the same for 11, 12 and 15 hours. I'd be very interested in the results.

Well, I will say, despite you spoon-feeding me, the whole "joule thing" is still a bit out of my comfort level. It is clear you need a comparable unit of energy to compare how much energy is being put in vs how much is going out. From a layman's point of view, it seems your definition of a joule is no different from watt-hours or kilowatt-hours.

The testing, that does sound interesting, but it sounds pretty labor-intensive too. For one thing, my charger isn't really set up to charge based on time. Though I imagine I could go into the menu of the iCharger, set a safety timer for 10 hours, let that terminate the charge, and graph it out. I have to admit I am not extremely well versed in its workings, but it should work. Tell you what: I won't make any promises (my time is valuable to me), but I will try :) You don't mention a discharge current; I guess for this point it doesn't matter? Say, an arbitrary number like 500 mA or even 1 A?

When you talk about charge input, such as "It seems to me that the standard input charge of 1.5C is used to make sure that the cell is fully charged when tested for capacity", what do you mean exactly? Do you mean charging current, or are you talking about the amount of energy introduced to the cell? I don't see how you can make a blanket statement that it is damaging to the cell. The white papers I have read pretty much all say NiMH cells can take a 1/10C charge rate indefinitely with no damage. At some point, it becomes in-efficient (is that a word? Did I use it right?) but not damaging.
 

Mr Happy

Flashlight Enthusiast
Joined
Nov 21, 2007
Messages
5,390
Location
Southern California
You have shown that even at a fixed level of resistance, power losses would be greater at higher charge rates. But, even more so, due to exothermic chemical reactions inside the NiMH cell, resistance increases while charging, increasing the losses exponentially.
It may seem counter-intuitive, but the opposite is true. The resistance of electrochemical systems tends to decrease as the temperature rises for two reasons. Firstly, the chemical reactions of the ions at the electrodes can proceed faster (chemical reactions speed up at higher temperatures), and secondly the speed at which the ions can move between the electrodes increases (the ions become more mobile at higher temperatures). These two factors are what cause the voltage drop at the end of charging of an NiMH cell.

Charging and discharging of NiMH cells is therefore more efficient when the cell is warm. That's why RC racers liked to use their NiMH cells when hot off the charger. Hot cells are observed to work better than cold cells.

As to partial vs full charging of the cell, the same is true for lithium cells. The first stage of the charge, the current-limiting stage, is by far the easiest, and the time it takes to complete is almost directly proportional to input current. Very nearly 1 to 1. However, the constant-voltage stage is of course the hard part.
But is it fair to compare charge efficiencies between a fully charged Li-Co cell and a mostly charged NiMH?
It is not really true to say the CC part of lithium ion charging is "easy" and the CV part is "hard". The difference in rate of charging is externally imposed by the charger and is not caused by the cell becoming resistant to charge. The reduction in charge rate during the CV phase is imposed to ensure the maximum cell potential is not exceeded in order to avoid damage to the cell.

Concerning joules, watt-hours are essentially the same thing: one joule equals one watt-second, which is 1/3600 of a watt-hour. In the battery world, mWh is certainly the more familiar unit.

The way to do constant current charging with a programmable charger is to choose a current that is too low for the cell to reach the termination condition. So for example, with a 2000 mAh Eneloop, you could set it to charge at a rate of 500 mA and set the safety timer to 4 hours (or maybe 3.5 hours if you can do that). Or charge at 400 mA for 4 hours. Then graph the result. If you do this, do it for interest and education, not because you have to.
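If it helps, the timer arithmetic is just this (my sketch, with an assumed target of 1.05 times rated capacity going in; adjust to taste):

```python
def timer_hours(rated_mah, current_ma, input_factor=1.05):
    """Hours needed to put input_factor x rated capacity into the cell."""
    return rated_mah * input_factor / current_ma

print(timer_hours(2000, 500))  # 4.2 h at 500 mA
print(timer_hours(2000, 400))  # 5.25 h at 400 mA
```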

Yes, it is true that the standard charge of 1.6 C is done to make sure the cell is charged as much as it can possibly be, even though this is wasteful. It means for instance, applying 3200 mAh to a 2000 mAh cell, wasting at least 1200 mAh x 1.5 V = 1800 mWh = 6.5 kJ as heat. Does it damage the cell? Yes and no. There is no sharp dividing line between "damage" and "wear". The more you use (or abuse) a cell, the more you use up its life expectancy. Every time you charge and discharge a cell you have subtracted one cycle from its life. A standard charge probably counts for more than one "standard" cycle by a significant margin, when compared to a 0.5C fast charge with good termination.
 

VegasF6

Flashlight Enthusiast
Joined
Dec 5, 2007
Messages
1,449
Location
Las Vegas
It may seem counter-intuitive, but the opposite is true. The resistance of electrochemical systems tends to decrease as the temperature rises for two reasons. Firstly, the chemical reactions of the ions at the electrodes can proceed faster (chemical reactions speed up at higher temperatures), and secondly the speed at which the ions can move between the electrodes increases (the ions become more mobile at higher temperatures). These two factors are what cause the voltage drop at the end of charging of an NiMH cell.

Interesting, I hadn't thought of it that way. Thanks for explaining.

It is not really true to say the CC part of lithium ion charging is "easy" and the CV part is "hard". The difference in rate of charging is externally imposed by the charger and is not caused by the cell becoming resistant to charge. The reduction in charge rate during the CV phase is imposed to ensure the maximum cell potential is not exceeded in order to avoid damage to the cell.

I am not sure I am completely following you here. What I should have said, perhaps, is that the current-limited phase can be sped up depending on charge rate. The voltage-limited phase, which makes up the majority of the charge cycle, can't. It is more a law of physics, if one chooses to follow the standard charging profile for a lithium battery, than it is a limitation of the charger itself.

Good thread by the way, I am learning from it, and isn't that the goal?
 

Mr Happy

Flashlight Enthusiast
Joined
Nov 21, 2007
Messages
5,390
Location
Southern California
I am not sure I am completely following you here. What I should have said, perhaps, is that the current-limited phase can be sped up depending on charge rate. The voltage-limited phase, which makes up the majority of the charge cycle, can't. It is more a law of physics, if one chooses to follow the standard charging profile for a lithium battery, than it is a limitation of the charger itself.
It is neither a law of physics, nor a limitation of the charger. It is what one chooses to do. Purely free choice. Words like easy or hard are not appropriate here. It is as easy to overcharge a lithium ion cell as it is to follow the recommended charging algorithm, maybe easier, even.

You could, if you chose, speed up the charging of a lithium ion cell by charging at constant current until it became fully charged. What is difficult about that (difficult in the sense of complication, not difficult in the sense of resistance to charging) is that it becomes hard to know when to stop charging. You must stop charging when the resting open circuit voltage of the cell reaches 4.2 V. But while you are applying charging current you cannot measure the resting voltage. Therefore you must either charge in pulses and wait for the cell to rest between pulses so you can measure the voltage, or you must apply a constant voltage in which case the resting voltage can never exceed the voltage you apply.

It is simplest and most convenient to use the CC/CV algorithm therefore, although various chargers do sometimes attempt the pulsed approach. Many people argue about this as you will have observed, so it is easiest just to stick with CC/CV and avoid contention.
 

KiwiMark

Flashlight Enthusiast
Joined
Oct 19, 2008
Messages
1,731
Location
Waikato, New Zealand
Lots of interesting input - but I still stand by my idea of using a 12V SLA or LiFePO4 equivalent. You can get a LiFePO4 motorcycle battery that weighs less than 2 lb but has 14Ah of capacity, you can get solar cells designed to charge 12V batteries, you can get chargers designed to work from 12V power - I think this would work easily without having to 'jury-rig' or 'kludge' anything.
Also: You could start with freshly charged Eneloops & a freshly charged 14Ah LiFePO4 motorcycle battery - even if you aren't getting as much power each day from the sun as you are using, it will take a while before you are out of battery power.

The only problem is that the LiFePO4 batteries are pretty expensive and the SLA batteries are pretty heavy.
 

guyburns

Newly Enlightened
Joined
Apr 13, 2011
Messages
27
So if you double the current and halve the charging time, you have doubled the resistance losses. (Four times the power losses over half the time equals twice as much energy lost overall.)
This applies to a wide variety of phenomena, and partly explains why increasing your speed in a car uses more petrol per distance. Even though you get there quicker, you use more petrol to overcome the wind and other resistances (probably square-law losses).

I did a quick calculation of my own using the Eneloop HR-3UTG datasheet.

For charging at 2000 mA at 25°C I calculated the energy supplied was 2980 mWh (= 10.7 kJ).
For discharging at 400 mA at 25°C I calculated the energy provided was 2480 mWh (= 8.9 kJ).

Good on ya, mate, for doing the calculations and checking up on me. Yours are just as accurate as mine (10.2 and 8.8) – maybe more accurate (who knows), given that the thickness of the graph lines would not allow accuracy better than about 5%.
 

guyburns

Newly Enlightened
Joined
Apr 13, 2011
Messages
27
From a layman's point of view, it seems your definition of a joule is no different from watt-hours or kilowatt-hours.
I can't claim it as my definition; V x I x t is how you calculate electrical energy. The smaller you make the time increments, the more accurate the answer is. And you are partly correct about a Joule being the same as Watt-hours and kW-hrs:

Watt-hr = 3600 J
kW-hr = 3.6 MJ

It's also similar to a lot of other units: foot-poundals, BTU, horsepower-minutes… Why Americans still stick with medieval units has got me beat!

You don't mention a discharge current; I guess for this point it doesn't matter? Say, an arbitrary number like 500 mA or even 1 A?
It does matter a little bit: the higher the discharge current, the less output capacity. Since I used the Eneloop discharge curve at 400 mA in my calculations of energy out, for the sake of being consistent, use 400 mA when discharging, no matter what the charge rate.

When you talk about charge input, such as "It seems to me that the standard input charge of 1.5C is used to make sure that the cell is fully charged when tested for capacity", what do you mean exactly? Do you mean charging current, or are you talking about the amount of energy introduced to the cell?
C is the battery capacity measured in amp-hours (or mAh). It is not equivalent to energy or current, but by convention it is used to define current when charging and discharging batteries. For example, C for an Eneloop is 2000 mAh. If you charge an Eneloop and put in 3000 mAh, that is equal to 1.5C. You can also use C to specify current: if you charge an Eneloop at 1C, that means 2000 mA; 0.1C means 200 mA. It is not strictly correct to do so, because 0.1C is actually 200 mAh, and you can't have a current of 200 mAh. However, even though incorrect, the term is widely used and accepted when applied to current for charging/discharging batteries. And it can be confusing until you get used to the convention.
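A tiny sketch of the convention, if it helps (my helper, nothing standard):

```python
# "Charging at 0.1C" means a current numerically equal to 0.1 x capacity.
def c_rate_to_ma(capacity_mah, c_rate):
    return capacity_mah * c_rate

ENELOOP_MAH = 2000
print(c_rate_to_ma(ENELOOP_MAH, 0.1))  # 200.0 (mA)
print(c_rate_to_ma(ENELOOP_MAH, 1.0))  # 2000.0 (mA)
```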

I don't see how you can make a blanket statement that it is damaging to the cell. The white papers I have read pretty much all say NiMH cells can take a 1/10C charge rate indefinitely with no damage. At some point, it becomes in-efficient (is that a word? Did I use it right?) but not damaging.
You are probably correct, although I'll stay out of an argument about the complexities of when cells are damaged. How do you define damage? Charging at 0.1C for 11 hours is being very kind to a cell and will give it long life and almost full capacity. Who would notice the difference between 1.0C and 0.95C in everyday use? Duracell says to limit the time to 12 hours. When testing a cell, or to guarantee full charge, the time is typically stated as 15 hours. My view is that any time a cell gets warm, or is overcharged, it is being damaged. I don't know to what degree, but heat can cause damage. Unless I have a good reason to charge a battery quickly, I tend to stick with putting back 1.1C into a cell at between 0.1C and 0.2C. Better still, I use the charger I designed to terminate charging at a 5 °C temperature rise; then I don't have to worry about the exact value of C. It charges every AA kindly, no matter what the actual capacity or what charge rate I use.

On the other hand, I reckon that anything other than excessive temperature rise on a regular basis is not going to reduce the cell life in any meaningful way. Batteries get lost, get dropped, corrode, decay with age, get left on accidentally for 2 days at 0.1C… whatever. What does it matter anyway if a $3 cell only lasts 298 cycles instead of 467? Still, I like to be kind to my batteries, and I consider that overcharging, no matter to how small a degree, should be avoided.
 

guyburns

Newly Enlightened
Joined
Apr 13, 2011
Messages
27
Lots of interesting input - but I still stand by my idea of using a 12V SLA or LiFePO4 equivalent. You can get a LiFePO4 motorcycle battery that weighs less than 2 lb but has 14Ah of capacity, you can get solar cells designed to charge 12V batteries, you can get chargers designed to work from 12V power - I think this would work easily without having to 'jury-rig' or 'kludge' anything.
Also: You could start with freshly charged Eneloops & a freshly charged 14Ah LiFePO4 motorcycle battery - even if you aren't getting as much power each day from the sun as you are using, it will take a while before you are out of battery power. The only problem is that the LiFePO4 batteries are pretty expensive and the SLA batteries are pretty heavy.

Thanks Mark for reiterating your suggestion. Keep them coming. I've looked into it, and LiFePO4 batteries are looking good. The only problem with the 12-volt packs would be the reduced power out of the solar panel. Four cells in series would require 15.2 volts at termination (3.8 volts per cell), and somewhere above 12 volts even when completely discharged. And this voltage range is above the optimum power output of the Sunlinq panel. I'm in the middle of testing the Sunlinq. Not a good time of the year for testing solar power in Tasmania – the max sun altitude today was 37°, but come summertime (I don't go out into the wilds in winter), when the sun reaches up to about 70° at midday, I expect the power figures to double.

The open-circuit and short-circuit outputs were: Voc = 15.0, Isc = 460 mA.

060 mA … 14.3 V (0.86 W)
100 mA … 13.7 V (1.37 W)
150 mA … 12.1 V (1.82 W)
200 mA … 10.5 V (2.10 W)
250 mA … 08.9 V (2.23 W)
300 mA … 07.3 V (2.19 W)
350 mA … 05.5 V (1.93 W)
400 mA … 03.6 V (1.44 W)

Every Watt counts for this exercise, and you can see that above 13 volts the panel's output drops off rapidly. The highest power out is between about 7 and 11 volts, so I want to try and match that. Putting the same amount of energy into a 12-volt pack from a solar panel will take significantly longer than into a 6 or 9-volt pack. My guess is something like twice as long. i.e. for the same daylight hours, I'll only get half the energy into a 12-volt pack as for a 6 or 9-volt pack.
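For what it's worth, here is a minimal sketch of how you could pull the maximum-power point out of those measurements programmatically (same numbers as above):

```python
# Measured load points: (current in mA, panel voltage in V).
points = [(60, 14.3), (100, 13.7), (150, 12.1), (200, 10.5),
          (250, 8.9), (300, 7.3), (350, 5.5), (400, 3.6)]

p_max, i_best, v_best = max((i / 1000 * v, i, v) for i, v in points)
print(f"Max measured power: {p_max:.2f} W at {i_best} mA / {v_best} V")
# -> about 2.23 W at 250 mA / 8.9 V, squarely in the 7-11 V band
```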

I came across two very good possibilities for batteries:

1. http://www.batteryspace.com/lifepo4...hsquare1728wh70aratewithpcbandpolyswitch.aspx
2. http://www.batteryspace.com/lifepo4...flatpack2592wh8aratewithpcbandpolyswitch.aspx

Both cost about $40, and what makes them very good candidates is that they come with comprehensive protection circuitry, and they are both within the panel's maximum-power range. The first is equivalent in capacity to 8 Eneloops, the second to 12 Eneloops. I could charge them straight from the solar panel if I was game enough to let the protection circuitry handle overvoltage (the cells should be charged to 3.80 volts per cell, but the protection comes in at 3.85). I can't see that the slight difference would be important when charging from a solar panel. But it might be. So I might need a voltage-clamp circuit on the output of the panel to limit the maximum voltage: 7.60 volts or 11.40 volts, depending on which pack I chose.

I wouldn't need the typical constant-current circuit used to initially charge Li-ions because the solar panel is current limited anyway (~460 mA).

The advantage of the 6.4 volt pack is that I could charge 2 Eneloops using the nifty Sanyo USB charger. It may work from a higher voltage, but I'm not sure yet. The disadvantage of both is that to charge the camera's Li-ion cells I would need a step-up DC-DC converter to get 12 volts.

And you are perfectly correct about "jury rigs" and "kludging". I don't want a mess of wires or rubbishy-looking home-made boxes. Whatever I put together will be a joy to look at, elegant and efficient. The Sunlinq certainly looks the part.
 