I want to make myself clear about what I am doing. I have no barrow to push. If any of my ideas about batteries and chargers are incorrect, and you can point me to authoritative sources that prove it, that's great. I'm here to learn about batteries, and I would ask the same in return. It has been suggested I run some tests. Over the last five years I have run several hundred tests (see
http://www.mediafire.com/?2cnw1u9z2axnxe0). If my test results are wrong, I want to know about it.
Powerstream is probably not an authoritative source. The best source of information would come from the people who make batteries. Duracell has this to say regarding NiMH:
If a timed charge termination is used, a time of 120 percent charge input is recommended with
a backup temperature cutoff of 60°C.
They also say in the same document:
Higher capacity levels are achieved with a 150 percent charge input, but at the expense of cycle life; long cycle life is attained with a 105 to 110 percent charge input, albeit with slightly lower capacity due to less charge input…
It seems to me that the standard charge input of 1.5C (150% of rated capacity) is used to make sure the cell is fully charged when tested for capacity. Manufacturers want to be able to state the highest possible rating for their cells, so during testing, even if it is highly inefficient and damaging to the cell, they force in 1.5C to make sure the cell is at maximum capacity. And fair enough: that's the way to get the maximum out of a cell. But if you put in only 1.05C, according to Duracell itself, this results in "slightly lower capacity". They don't mention actual figures, but my testing indicates the loss is very small, around 5%:
• 1.50 C (input) => 1.0 C (output)
• 1.05 C (input) => 0.95 C (output)
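To put numbers on those two cases, charge efficiency (charge out divided by charge in) can be computed directly; a quick sketch using the figures above:

```python
# Charge efficiency = charge out / charge in, using the figures above.
# Inputs and outputs are expressed as multiples of rated capacity C.
cases = {
    "150% input": (1.50, 1.00),
    "105% input": (1.05, 0.95),
}

for label, (charge_in, charge_out) in cases.items():
    efficiency = charge_out / charge_in
    print(f"{label}: {efficiency:.1%} of the input charge is recovered")
```

So the heavy overcharge used for capacity testing recovers only about two-thirds of the input, while the 105% charge recovers over 90% of it.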
The manufacturer will of course choose the first: it gives their batteries a higher rating.
I am somewhat thrown by your conversions to Joules; wouldn't it be better to measure Watts instead?
A Joule is the base unit of energy. A Watt is a derived unit (Joules per second). A Watt cannot be used to work out the total energy transferred. There is no useful information conveyed by asking "How many Watts can that battery provide?" There is no single answer: 0.01 Watts when running an iPod; 10 Watts when running a Cree LED. But ask "How much energy does that battery store?" and that can be answered.
To use an analogy: the energy stored in food is always expressed in Calories (1 food Calorie ≈ 4180 J), never in Calories per second. A chocolate bar might have 200 Calories (836,000 J); it could also be said to provide 836 Watts (for 1000 seconds), or 1000 Watts (for 836 seconds), or 1 Watt (for about 9.68 days). See what I mean? Energy is the thing.
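The chocolate-bar arithmetic above can be checked in a few lines (using the same 4180 J per Calorie figure as above):

```python
CAL_TO_J = 4180          # J per food Calorie (kcal), as used above
energy_j = 200 * CAL_TO_J  # a 200-Calorie chocolate bar -> 836,000 J

# The same fixed energy expressed as different power levels:
for seconds in (1000, 836, 836_000):
    print(f"{energy_j / seconds:g} W for {seconds} s")

print(f"At 1 W, that lasts {energy_j / 86400:.2f} days")
```

Same energy, wildly different power figures, which is why the Watt alone tells you nothing about how much a battery stores.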
Not sure how you could extract these numbers from a graph since they are on a slope.
Energy = Volts x Amps x Seconds. You split the graph into 1 second intervals, find the average of the voltage and current for each interval, then calculate the energy: V x I x 1. Add up all the energy for the period of discharge and you have the total energy transferred. The easier way is to work out the area under the graph. How did I do that? I used Photoshop to calculate the area for me, and then double-checked by visually averaging the graphs.
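The per-interval method described above is just a numerical integration of V × I over time. A minimal sketch, using made-up sample points (a real run would use logged data from a discharge analyzer):

```python
# Numerically integrate energy = sum of V_avg * I_avg * dt over a discharge,
# using 1-second intervals and interval-average values, as described above.
# These sample points are illustrative only, not a real discharge curve.
times = [0, 1, 2, 3, 4]                     # seconds
volts = [1.30, 1.28, 1.26, 1.25, 1.24]      # cell voltage at each second
amps  = [0.50, 0.50, 0.50, 0.50, 0.50]      # discharge current at each second

energy_j = 0.0
for i in range(len(times) - 1):
    v_avg = (volts[i] + volts[i + 1]) / 2   # average voltage over the interval
    i_avg = (amps[i] + amps[i + 1]) / 2     # average current over the interval
    dt = times[i + 1] - times[i]            # 1 second here
    energy_j += v_avg * i_avg * dt

print(f"Total energy: {energy_j:.3f} J")
```

Summing these slices is exactly the "area under the graph" shortcut: each slice is one thin column of that area.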
Luckily for me I can simply let my chargers do the work and tell me how much energy it took to charge each cell. Then I allow the cells to settle for an hour before testing discharge.
Please test if you have the time. Using 200 mA, charge an Eneloop for 10 hours, then test its capacity; do the same for 11, 12 and 15 hours. I'd be very interested in the results.
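For reference, here is the charge input each of those durations represents (charge in mAh = current × hours; the percentages assume a nominal 2000 mAh Eneloop):

```python
NOMINAL_MAH = 2000  # nominal Eneloop capacity assumed here
CURRENT_MA = 200    # charge current from the suggested test

for hours in (10, 11, 12, 15):
    charge_in = CURRENT_MA * hours
    print(f"{hours} h -> {charge_in} mAh ({charge_in / NOMINAL_MAH:.0%} of nominal)")
```

Those durations span 100% to 150% charge input, which brackets exactly the range Duracell discusses (105–110% for cycle life, 150% for maximum capacity).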
This gives a nominal 2000 mAh cell a charge of 3200 mAh, which results in 1.6 times more charge going in than is available to use.
Yes, I agree. But that does not mean you have to put in 3200 mAh to get back 2000 mAh. The cell is being purposely overcharged to make sure it is holding maximum capacity. That's a sensible thing to do while testing. It is not a sensible thing to do in normal use.
Based upon this information, you may speculate that charge efficiency is related to charge rate.
If someone does speculate that, I think they are wrong. Put an Eneloop into your charger, discharge it, recharge it at 200 mA for various times as described above, and test its capacity.
When charging at slow rates, you end up putting a lot more in. About the best charge efficiency for NiMH chemistry occurs when charging at 2C. At that rate you only need to put in about 1.05 times more charge than is available to use. You can verify all of this for yourself by running some tests.
Can you provide a reference for 2C being the best charge efficiency? I have not seen that figure mentioned in spec sheets. For discharge, 2C is definitely the least efficient; the spec sheets clearly show that. I don't know whether the same applies for charging.
I would be very interested in the results from anyone prepared to test as described above. My results may be wrong. I have performed extensive testing (see Test 6, p7 of my PDF). After 37 tests using charge rates from 200 mA to 1000 mA, there was no significant difference in the capacity stored by a variety of batteries. It seems to me that whether the charge current into a NiMH is 100 mA, 500 mA or 2000 mA, if you put in 2100 mAh, out will come ~2000 mAh.
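If that claim holds, the only thing the charge current changes is how long it takes to deliver the same 2100 mAh input; a quick sketch:

```python
CHARGE_IN_MAH = 2100  # fixed charge input, per the claim above

# Time to deliver the same charge input scales inversely with current.
for current_ma in (100, 500, 2000):
    hours = CHARGE_IN_MAH / current_ma
    print(f"{current_ma} mA -> {hours:.2f} h to deliver {CHARGE_IN_MAH} mAh")
```

In other words, under this claim the rate determines convenience, not efficiency: 21 hours at 100 mA and just over an hour at 2000 mA deliver the same usable ~2000 mAh.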
That statement of mine may be at variance with what people believe about charging NiMHs. I have a lot of tests which back up that statement. As I have already said, though: if my results are wrong, I want to know about it. I'm not here to stick up for my own wrong ideas. I'm here to learn! Truth over false beliefs any day.