# mAh vs. mWh



## Viking (Nov 13, 2015)

I've been told that in some electronic devices mAh is the best way to compare useful battery capacity from one battery to the next, while in other electronic devices mWh is better suited.

I guess it depends on the discharge method of the particular device (correct me if I'm wrong).
As I understand it, there are three different discharge methods: Constant Resistance, Constant Current and Constant Power.

Under which of these methods are mAh and mWh, respectively, best suited for the task?


----------



## SemiMan (Nov 14, 2015)

Viking said:


> I've been told that in some electronic devices mAh is the best way to compare useful battery capacity from one battery to the next, while in other electronic devices mWh is better suited.
> 
> I guess it depends on the discharge method of the particular device (correct me if I'm wrong).
> As I understand it, there are three different discharge methods: Constant Resistance, Constant Current and Constant Power.
> ...



Depends entirely on the load.

If the driver is a linear constant current driver, then essentially it is controlling the current directly coming from the battery, hence mAh is best. If you have a buck or boost driver, then the output power is usually fixed, and efficiency is usually somewhat consistent, hence mWh is a better measurement.
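To put rough numbers on that, here is a sketch with made-up but plausible figures (a hypothetical 3000 mAh, ~3.6 V nominal 18650 and an assumed 90% converter efficiency, not values from any datasheet):

```python
# Constant-current load (linear driver): runtime is set by mAh.
# Constant-power load (buck/boost driver): runtime is set by mWh,
# discounted by converter efficiency.

def runtime_hours_cc(capacity_mah, load_ma):
    """Runtime under a constant-current load."""
    return capacity_mah / load_ma

def runtime_hours_cp(energy_mwh, load_mw, efficiency=0.9):
    """Runtime under a constant-power load behind a lossy converter."""
    return energy_mwh * efficiency / load_mw

# Hypothetical 18650: 3000 mAh at ~3.6 V nominal, so roughly 10800 mWh.
print(runtime_hours_cc(3000, 1000))   # 3.0 hours at a 1 A constant-current draw
print(runtime_hours_cp(10800, 4000))  # 2.43 hours at a 4 W constant-power draw
```

With a constant-current load the mWh figure never enters the calculation; with a constant-power load the mAh figure doesn't.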

Semiman


----------



## Viking (Nov 14, 2015)

"Buck or boost driver" meaning constant power discharge, right?


----------



## SemiMan (Nov 14, 2015)

Viking said:


> "Buck or boost driver" meaning constant power discharge, right?



Technically constant output current, but yes, the output power will be pretty constant too, with only the temperature effect on LED voltage causing a change.


----------



## Rick NJ (Nov 14, 2015)

Viking said:


> I've been told that in some electronic devices mAh is the best way to compare useful battery capacity from one battery to the next, while in other electronic devices mWh is better suited.
> 
> I guess it depends on the discharge method of the particular device (correct me if I'm wrong).
> As I understand it, there are three different discharge methods: Constant Resistance, Constant Current and Constant Power.
> ...



When I faced the exact same question you have, I did a lot of reading. I came to realize that there is no standard mAh rating method. Some manufacturers do it one way, some do it another, and not all of them publish the way they determine their mAh. I reluctantly came to pretend that the ratings are universal, and use the published mAh as an estimate of potential capacity.

Truth is, every battery is different. Even cells that came off the same lot will not have exactly the same mAh. For the same battery, the mAh measured at 100 mA can be radically different from the mAh measured at 2 A. So a single Wh or mAh number is a gross simplification and will not be adequate for judging which battery would run longer.

In my view, you need a two-dimensional graph to determine run time: total mWh on the Y axis and drain current on the X axis. Using the expected range of current, you can read from the graph the energy you can expect to get out of the battery.

Of course, plotting the graph uses up a lot of recharge cycles on the battery, so I take only three data points: 500 mA, 1 A and 2 A. Depending on the expected current, I then know how long a specific battery would probably last. Interestingly, of my twelve 18650s, my best-mAh battery at 500 mA is also my worst battery at 2 A.


----------



## Viking (Nov 14, 2015)

SemiMan said:


> Technically constant output current, but yes, the output power will be pretty constant too, with only the temperature effect on LED voltage causing a change.



Now I'm a little confused :thinking:

For a constant current discharge, mAh is best suited. I think I got that.

What about constant power and constant resistance? Is mAh or mWh best suited, or does it depend on other things too?


----------



## mattheww50 (Nov 14, 2015)

It is going to depend upon the application. When you are dealing with an energy source that is essentially constant voltage, such as NiMH or NiCd batteries, in the end it makes very little difference whether you measure the capacity in watt-hours or amp-hours: the voltage is almost constant throughout the discharge, so there is a straight-line relationship between the two. Once you leave the realm of constant-voltage sources (as with Li-Ion batteries), the picture gets a lot murkier. If the device uses a linear driver, then the measurement has to be amp-hours to be meaningful. If I have a 12 volt, 2 Ah supply that feeds a linear driver outputting 2 volts at 2 amps, no matter what the watt-hour capacity is, the run time will be an hour, because the linear driver supplies constant current and simply wastes the excess voltage as required. So a 12 V, 2 Ah supply has 24 Wh capacity but gives the same run time as a 2 volt, 2 Ah supply. In this case the watt-hour capacity is almost meaningless, because the linear driver just wastes any voltage in excess of what is required.

Boost and true buck drivers, on the other hand, are basically constant-wattage-output devices with a known conversion efficiency. If you need 4 watts output, as in my previous example, it is going to take something a little in excess of 4 watts input regardless of the supplied voltage. So a fully charged Li-Ion is going to need to supply about an amp at 4.2 volts, but when the battery has been run down to, say, 3.2 volts, that same driver will draw about 1.25 amps. If we carry it even further and compare a driver fed with a pair of CR123As in series versus a single 18650, our 4 watts will need about 1 amp from the 18650 but only about 670 mA from the pair of CR123As. Having said that, conversion efficiency does tend to decline with falling supply voltage, i.e. the efficiency of the driver is likely to be higher with the pair of CR123As than with the single 18650.

In the first case, mAh and mWh will give you the same answer; in the second case, mAh will tend to overstate the useful capacity; and in the third case, mAh will grossly understate the capacity of the CR123-based supply compared to the Li-Ion.


----------



## idleprocess (Nov 14, 2015)

Each cell chemistry has a nominal voltage: 1.2 V NiCd/NiMH, 2 V lead-acid, 3.3 V LiFePO4, 3.6 V Li-Ion/Poly. Knowing this, you can use the rated amp-hours to get an idea of capacity by simple multiplication: a 2600 mAh Li-Ion cell is 9.36 Wh; a 2000 mAh NiMH cell is 2.4 Wh. This is the method I see used most by manufacturers who rate battery watt-hours: a 54 Wh, 14.4 V laptop battery uses 3.75 Ah cells; a 72 Wh, 18 V power tool battery even _advertises_ that it uses 4 Ah cells. Some manufacturers might use a more realistic method to determine watt-hours, with a more sophisticated integration of the voltage curve under standardized design-case loads; however, I've just not encountered any in the consumer and commodity-product arena.
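The nominal-voltage shortcut, spelled out with those same example figures:

```python
# Nominal per-cell voltages by chemistry (the values quoted above).
NOMINAL_V = {"NiCd": 1.2, "NiMH": 1.2, "lead-acid": 2.0,
             "LiFePO4": 3.3, "Li-Ion": 3.6}

def wh_from_mah(mah, chemistry):
    """Approximate watt-hours from rated mAh and nominal voltage."""
    return mah / 1000 * NOMINAL_V[chemistry]

print(wh_from_mah(2600, "Li-Ion"))  # ~9.36 Wh
print(wh_from_mah(2000, "NiMH"))    # ~2.4 Wh
print(54 / 14.4)                    # ~3.75 Ah cells in the 54 Wh laptop pack
print(72 / 18)                      # 4.0 Ah cells in the 72 Wh tool pack
```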

When comparing substantially similar chemistries (e.g. standby-power-rated lead-acid, low-self-discharge NiMH, 20C-rated Li-Poly), the industry standard seems to be amp-hours, since watt-hours just adds a moderate layer of obfuscation. It's the data sheets (provided they're reliable) that will show differences: if two different models of similar products provide semi-standardized test data, one might perform better than the other in particular circumstances.

Note that capacity ratings typically assume a fairly low current draw of a fraction of *C* _(the rate that would drain a cell in one hour: 2 amps for a 2000 mAh cell, 500 milliamps for a 500 mAh cell, etc.)_. Drawing more than the test current will typically result in lower delivered amp-hours, while drawing less will result in higher delivered amp-hours.
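For concreteness, the *C* convention works out like this:

```python
def current_at_c(capacity_mah, c_rate):
    """Current (mA) at a given C rate: 1C nominally drains the cell in one hour."""
    return capacity_mah * c_rate

print(current_at_c(2000, 1.0))  # 2000 mA: 1C for a 2000 mAh cell
print(current_at_c(500, 1.0))   # 500 mA: 1C for a 500 mAh cell
print(current_at_c(2000, 0.2))  # 400 mA: a fractional-C (0.2C) test current
```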

As to your specific question, you should look at the data sheets for cells of interest to see if the data has scenarios that fit your use case. Energizer has quite the collection of data sheets with the scenarios you describe - constant current/resistance/power - although sadly their testing for NiMH cells appears to be more cursory.


----------



## Viking (Nov 14, 2015)

Rick NJ said:


> When I faced the exact same question you have, I did a lot of reading. I came to realize that there is no standard mAh rating method. Some manufacturers do it one way, some do it another, and not all of them publish the way they determine their mAh. I reluctantly came to pretend that the ratings are universal, and use the published mAh as an estimate of potential capacity.
> 
> Truth is, every battery is different. Even cells that came off the same lot will not have exactly the same mAh. For the same battery, the mAh measured at 100 mA can be radically different from the mAh measured at 2 A. So a single Wh or mAh number is a gross simplification and will not be adequate for judging which battery would run longer.



I'm not referring to manufacturers' mAh ratings. I know they aren't always trustworthy.
I also know that using different discharge rates will influence the measured capacity.

I'm talking about the case where the discharge is done in the exact same manner.


----------



## Viking (Nov 14, 2015)

mattheww50 said:


> It is going to depend upon the application. When you are dealing with an energy source that is essentially constant voltage, such as NiMH or NiCd batteries, in the end it makes very little difference whether you measure the capacity in watt-hours or amp-hours: the voltage is almost constant throughout the discharge, so there is a straight-line relationship between the two. Once you leave the realm of constant-voltage sources (as with Li-Ion batteries), the picture gets a lot murkier. If the device uses a linear driver, then the measurement has to be amp-hours to be meaningful. If I have a 12 volt, 2 Ah supply that feeds a linear driver outputting 2 volts at 2 amps, no matter what the watt-hour capacity is, the run time will be an hour, because the linear driver supplies constant current and simply wastes the excess voltage as required. So a 12 V, 2 Ah supply has 24 Wh capacity but gives the same run time as a 2 volt, 2 Ah supply. In this case the watt-hour capacity is almost meaningless, because the linear driver just wastes any voltage in excess of what is required.
> 
> Boost and true buck drivers, on the other hand, are basically constant-wattage-output devices with a known conversion efficiency. If you need 4 watts output, as in my previous example, it is going to take something a little in excess of 4 watts input regardless of the supplied voltage. So a fully charged Li-Ion is going to need to supply about an amp at 4.2 volts, but when the battery has been run down to, say, 3.2 volts, that same driver will draw about 1.25 amps. If we carry it even further and compare a driver fed with a pair of CR123As in series versus a single 18650, our 4 watts will need about 1 amp from the 18650 but only about 670 mA from the pair of CR123As. Having said that, conversion efficiency does tend to decline with falling supply voltage, i.e. the efficiency of the driver is likely to be higher with the pair of CR123As than with the single 18650.
> 
> In the first case, mAh and mWh will give you the same answer; in the second case, mAh will tend to overstate the useful capacity; and in the third case, mAh will grossly understate the capacity of the CR123-based supply compared to the Li-Ion.



Can you specify a little more precisely the cases where mAh and mWh, respectively, are best suited? I'm confused, since I can't get it down to only the three cases you see :thinking:

If I understand correctly (somebody correct me if I'm wrong): in devices using constant current, mAh is best suited for comparison, and in devices using constant power, mWh is best suited?

And for chemistries with a flat discharge curve, like NiMH, it doesn't really matter whether you use mAh or mWh.


----------



## Timothybil (Nov 14, 2015)

Simplest case: if comparing cells of similar voltage, a simple mAh comparison works just fine. Example: comparing multiple NiMH cells. Since they are all a nominal 1.2 V, a direct comparison of mAh ratings should give an idea of relative performance. This always assumes reasonably accurate mAh ratings from the manufacturer.

When comparing cells of different voltage ratings, mWh is the only way to go. Example: to compare a NiMH cell at 1.2 volts and 2500 mAh to a Li-Ion cell at 4.2 volts and 3200 mAh, the only accurate way is to convert both to mWh. So if one had a flashlight that could use four AAs or one 18650 with the capacity ratings given, a mWh comparison would show that the two setups offer very similar capacity.

This, of course, does not take into consideration driver type, efficiency, or current draw from different cell types, but it does give a quick rough indication of the energy available in each configuration.
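A back-of-the-envelope version of that comparison (using the nominal 3.6 V for the Li-Ion's average; 4.2 V is only its fully charged resting voltage):

```python
# Four 1.2 V / 2500 mAh NiMH AAs in series vs. one 3200 mAh 18650.
aa_pack_mwh = 4 * 1.2 * 2500  # cells in series: voltages add, mAh doesn't
li_ion_mwh = 3.6 * 3200       # nominal-voltage estimate

print(aa_pack_mwh)  # ~12000 mWh
print(li_ion_mwh)   # ~11520 mWh: the same ballpark
```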


----------



## Viking (Nov 14, 2015)

I'm mostly thinking of batteries of the same type and chemistry.
batteryshowdown, which measures both mAh and mWh in their 1.5 volt battery tests, says the following on the matter:



> *mAh vs mWh – the difference*
> 
> 
> The majority of batteries which have a rating (such as rechargeables) are rated in milliamp-hours, or mAh. 1000 mAh represents the equivalent energy load of 1000 milliamps applied for one hour. However, the issue with mAh is it fails to account for falling battery voltage. In general, a digital camera will use a switching converter, which will get more life with a higher mWh battery. But a torch, toy or similar device is fairly independent of voltage, and will last longer with a higher mAh battery. We provide both in the results tables so they can be compared.
> ...



A switching converter sounds like constant power to me, but I can't seem to get a straight answer on whether that is the case or not. Maybe it's because the answer is more complex than that, I don't know.


And here is what alkalines.eu says, among other things.
Are they referring to constant current in this statement?



> in some cases, mAh provide a better estimate of the battery ability to power your device:
> Devices using a voltage regulator: these devices use a voltage regulator to decrease the voltage provided by the batteries to a value supported by the electronics. This means that as long as the voltage from the batteries is high enough, the current consumed by the electronics will be the same as the current provided by the battery. In this specific case we therefore only need to know the mAh a battery can provide above a given cut-off voltage where the voltage regulator stops working (and the batteries must be replaced).
> 
> http://www.alkalines.eu/2014/01/which-metric-to-compute-energy-used-mah.html


----------



## Gauss163 (Nov 14, 2015)

*Explicit context is needed in question and answers*

Beware that some of the answers/excerpts are based on assumptions not made explicit, e.g. the prior quote is assuming a _linear_ (vs. switching) regulator. You are getting answers containing many such implicit assumptions (e.g. that the load is a flashlight) because the original question gave no context, so answers are often (implicitly) inferring or imposing context.


----------

