Using Li-ion cells in LED flashlights safely

lunas

Enlightened
Joined
Apr 18, 2014
Messages
206
Thanks for the input, people... the battery just gets warm, not to the level of being too hot to hold. The flashlight body, however, if left tailstanding for 10 minutes (I use it for a ceiling bounce on max to illuminate a group photo), does get unbearably hot to hold initially. I guess that's not such a bad thing, as it would mean heat is being dissipated from the head throughout the body?
Yeah, I have a couple of lights like that. My new Thorfire TG06 gets warm on high within about 2 minutes of use. We call these lights hand warmers...

Most high-power LED lights will get very hot. All my XM-L lights and both my XP-G2 lights get hot. It's when one of these 300+ lumen lights doesn't make any noticeable heat that you need to worry, as it means the LED is thermally isolated from the body of the light. The confined heat will eventually burn out the LED, or, if the driver is good, it will throttle.

An example of throttling: the Fenix E99 Ti or the Fenix E05 (2014 model) will only stay in high mode for 3 minutes, then kick down to medium.
 

JLB

Newly Enlightened
Joined
Oct 8, 2015
Messages
3
Thanks for the information. I have left my 18650's in my chargers overnight. I have a Pila and thought it was safe to leave them in there. I guess no more.
 

recDNA

Flashaholic
Joined
Jun 2, 2009
Messages
8,761
I feel the need to touch batteries often to check for signs of heat. I would never leave them alone for 15 minutes; I don't care what the charger or battery is. One of these days I need to buy an infrared thermometer so I can more easily check the heat of charging cells.
 

lumensearch

Newly Enlightened
Joined
Oct 19, 2015
Messages
9
Evening guys

I'm looking for some average weights of 18650s versus capacity... any pointers?
 

sidecross

Flashlight Enthusiast
Joined
Jul 29, 2012
Messages
1,369
I have 40 18650 batteries, bought in groups of ten over the last 20 months. Each group is separated by manufacturer and mAh rating. I have 10 Eagletac 3100mAh, 10 Keeppower 3100mAh, 10 Eagletac 3400mAh, and 10 Keeppower 3400mAh. It is easy to keep these batteries in groups for my lights that use 4 18650s and 2 18650s.
 

Flashy808

Enlightened
Joined
Jul 2, 2015
Messages
282
What voltage should 14500 Li-ion batteries be charged to? From what I can see it is 4.2 volts (as mentioned in multiple places on CPF), but the actual battery itself says it is rated for 3.7 volts.

Can someone please help clarify? :thinking::thinking:

Oh, and BTW, if those 14500 batteries were only supposed to be charged to 3.7V, I may have made a little boo-boo :oops:.

Thanks!
 

lunas

Enlightened
Joined
Apr 18, 2014
Messages
206
What voltage should 14500 Li-ion batteries be charged to? From what I can see it is 4.2 volts (as mentioned in multiple places on CPF), but the actual battery itself says it is rated for 3.7 volts.

Can someone please help clarify? :thinking::thinking:

Oh, and BTW, if those 14500 batteries were only supposed to be charged to 3.7V, I may have made a little boo-boo :oops:.

Thanks!
3.7 is the nominal voltage they will drop to under load; 4.2 is the max voltage (±5%) they should rest at. 3.0 is the absolute minimum they should be allowed to be discharged to; most protection circuits will turn off at 3.3. There are some that go to 2.8, but those make people around here nervous... And of course I'm talking about individual cells, not packs.

Also, with a good charger, the charger will charge them at 350mA or 750mA; a few will actually fast-charge at over 1A up to 3.7V, then trickle-charge the final bit up to 4.2V before turning off.

In general a battery can be charged at 1C; there are a few that do 2C charging, but those are typically LiPo. The C rating of a battery is used to describe the discharge rate and the charge rate it can handle. A 14500 is typically between 600mAh and 1200mAh; the 1200mAh ones are usually fakes, and 700 or 750mAh is more likely. A cell that is 750mAh and can charge at 1C means it can charge at 750mA, which should charge one of these cells very quickly. Pushing tolerances is what breeds disaster, and pushing 750mA into a battery that maxes out at 750 is a good way to vent it, even if it claims it can take it.
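
For anyone who wants to play with the C-rate arithmetic in that post, here is a minimal Python sketch assuming the 750mAh / 1C figures given as the example; the function names and the 350mA comparison are just illustrative, and real charge times run longer because of the termination phase:

```python
# Rough C-rate arithmetic for a single cell, using the 750 mAh / 1C example above.
# charge current (mA) at a given C rate = capacity (mAh) * C rate

def charge_current_ma(capacity_mah: float, c_rate: float) -> float:
    """Current in mA corresponding to charging a cell at the given C rate."""
    return capacity_mah * c_rate

def rough_charge_time_hours(capacity_mah: float, charge_ma: float) -> float:
    """Very rough charge time estimate; ignores the slow top-off/termination phase."""
    return capacity_mah / charge_ma

capacity = 750.0                                  # mAh, the example 14500 cell
print(charge_current_ma(capacity, 1.0))           # 750.0 mA at 1C
print(rough_charge_time_hours(capacity, 750.0))   # ~1.0 h at 1C (optimistic)
print(rough_charge_time_hours(capacity, 350.0))   # ~2.1 h at a gentler 350 mA
```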
 

HKJ

Flashaholic
Joined
Mar 26, 2008
Messages
9,715
Location
Copenhagen, Denmark
3.7 is the nominal voltage they will drop to under load; 4.2 is the max voltage (±5%) they should rest at.

No, that is the voltage they should be charged to; resting voltage will usually be lower.


3.0 is the absolute minimum they should be allowed to be discharged to; most protection circuits will turn off at 3.3. There are some that go to 2.8

Protection usually trips between 2.0 volts and 2.5 volts (I did some tests here: http://lygte-info.dk/info/DischargeProtectionTest UK.html ).
Minimum voltage during discharge is usually rated at 2.5, 2.75 or 3.0 volts, depending on the cell; resting voltage will usually be well above 3 volts.
 

Flashy808

Enlightened
Joined
Jul 2, 2015
Messages
282
Oh, I think I understand now. So you charge all cells to 4.2 volts, but letting them rest for about [how long??] should bring them back down to their nominal voltage?

If this is correct, then should I let the battery rest until it reaches 3.7V before resuming use, or just jump straight in?

& am I correct in assuming nominal voltage = the voltage when rested, and what it is normally supposed to produce?

Thanks
 

thedoc007

Flashlight Enthusiast
Joined
Feb 16, 2013
Messages
3,632
Location
Michigan, USA
Oh, I think I understand now. So you charge all cells to 4.2 volts, but letting them rest for about [how long??] should bring them back down to their nominal voltage?

If this is correct, then should I let the battery rest until it reaches 3.7V before resuming use, or just jump straight in?

& am I correct in assuming nominal voltage = the voltage when rested, and what it is normally supposed to produce?

Thanks, I would have never known!


Heh, no, you don't have it yet.

Nominal voltage has little or nothing to do with resting, or charging, or loading a cell. A typical li-ion works from 4.2 down to 2.5 volts or so - the nominal voltage is just an arbitrary value somewhere in that range. That's why you see some manufacturers list 3.6, and some list 3.7 volts as the "nominal" value. Neither is empirically more accurate... the nominal voltage is there only as a general guideline. For example, you know not to put a 3.7 volt cell in a light that has a range of 0.9 to 2 volts. But a (nominal) 3.7 volt cell can be 2.9 volts under load, or 4.1 volts at rest, or 3.1 volts at rest, or 3.8 volts under load. Note that this "nominal" voltage applies to NiMH, lithium, alkalines, and every other kind of cell, not just li-ion. None of them maintain exactly the same voltage throughout a full cycle.

3.7 volt li-ions will typically charge safely to approximately 4.2 volts. As soon as a cell is removed from the charger, the voltage will sag a very small amount - perhaps from 4.2 volts to 4.17 volts, for example. This is perfectly normal, and does not mean the cell isn't fully charged. In fact, if you see 4.2 volts at rest, that means the cell was at least slightly overcharged. Also remember that cell ratings are usually ±0.05 volts, though, so a reading of 4.21 or 4.18 volts is fine too. No need to be worried about it unless the cell is out of that recommended range.

The voltage under load is dependent on how much current you are pulling from the cell, and what chemistry it uses. IMR, for example, can typically deliver higher currents, and sag less under load. The trade off is lower capacity. A cell charged to 4.2 volts might sag to 3.9 volts right away under a three amp load, for example, and then gradually decline as the cell is depleted.

You can use a cell immediately after charging. There is no need to wait for the voltage to settle.

You have heard from HKJ here, but this is one of his write-ups I have found particularly useful. Thanks HKJ!
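
As a small illustration of the voltage ranges and the ±0.05V tolerance mentioned above, here is a Python sketch that simply restates those numbers; the thresholds and the function name are illustrative, not a substitute for a cell's datasheet:

```python
# Classify a resting voltage reading using the figures from the post above:
# full charge ~4.20 V with roughly +/- 0.05 V tolerance, and a working range
# of about 4.2 V down to 2.5 V for a typical "3.7 V" li-ion cell.

def describe_resting_voltage(volts: float) -> str:
    if volts > 4.25:
        return "above the usual 4.20 +/- 0.05 V window - possibly overcharged"
    if volts >= 4.15:
        return "essentially full (within the usual tolerance)"
    if volts >= 3.0:
        return "somewhere in the normal working range"
    if volts >= 2.5:
        return "getting low - recharge soon"
    return "below the typical 2.5 V floor - the cell may be over-discharged"

for reading in (4.21, 4.17, 3.80, 2.90, 2.30):
    print(f"{reading:.2f} V: {describe_resting_voltage(reading)}")
```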
 

Flashy808

Enlightened
Joined
Jul 2, 2015
Messages
282
Heh, no, you don't have it yet.

Nominal voltage has little or nothing to do with resting, or charging, or loading a cell. A typical li-ion works from 4.2 down to 2.5 volts or so - the nominal voltage is just an arbitrary value somewhere in that range. That's why you see some manufacturers list 3.6, and some list 3.7 volts as the "nominal" value. Neither is empirically more accurate... the nominal voltage is there only as a general guideline. For example, you know not to put a 3.7 volt cell in a light that has a range of 0.9 to 2 volts. But a (nominal) 3.7 volt cell can be 2.9 volts under load, or 4.1 volts at rest, or 3.1 volts at rest, or 3.8 volts under load. Note that this "nominal" voltage applies to NiMH, lithium, alkalines, and every other kind of cell, not just li-ion. None of them maintain exactly the same voltage throughout a full cycle.

3.7 volt li-ions will typically charge safely to approximately 4.2 volts. As soon as a cell is removed from the charger, the voltage will sag a very small amount - perhaps from 4.2 volts to 4.17 volts, for example. This is perfectly normal, and does not mean the cell isn't fully charged. In fact, if you see 4.2 volts at rest, that means the cell was at least slightly overcharged. Also remember that cell ratings are usually ±0.05 volts, though, so a reading of 4.21 or 4.18 volts is fine too. No need to be worried about it unless the cell is out of that recommended range.

The voltage under load is dependent on how much current you are pulling from the cell, and what chemistry it uses. IMR, for example, can typically deliver higher currents, and sag less under load. The trade off is lower capacity. A cell charged to 4.2 volts might sag to 3.9 volts right away under a three amp load, for example, and then gradually decline as the cell is depleted.

You can use a cell immediately after charging. There is no need to wait for the voltage to settle.

You have heard from HKJ here, but this is one of his write-ups I have found particularly useful. Thanks HKJ!

Hmm, I think I am starting to get it now.
It's mainly the relationship between the under-load voltage and the nominal voltage that I didn't understand.

Thanks doc for helping! & of course everyone else :).
 

Phlogiston

Enlightened
Joined
Jan 7, 2015
Messages
601
Location
Scotland
As thedoc007 says, the most common types of Li-Ion cell can have an observed voltage of anything from 2.5V to 4.2V, depending on the cell's state of charge and how heavy the load is. Lower levels of charge and / or heavier loads reduce the observed voltage of the cell.

Note that most people prefer to keep their cells over 3V, to minimise the risk of accidental over-discharge and damage to the cell, especially given that not all cells are rated to go as low as 2.5V. A fair few people opt for higher voltages still.

As far as the nominal voltage goes, it might help if you think of it as an "average" voltage, seen during a manufacturer's capacity tests as the cell goes from fully charged to fully depleted.
 

Flashy808

Enlightened
Joined
Jul 2, 2015
Messages
282
As thedoc007 says, the most common types of Li-Ion cell can have an observed voltage of anything from 2.5V to 4.2V, depending on the cell's state of charge and how heavy the load is. Lower levels of charge and / or heavier loads reduce the observed voltage of the cell.

Note that most people prefer to keep their cells over 3V, to minimise the risk of accidental over-discharge and damage to the cell, especially given that not all cells are rated to go as low as 2.5V. A fair few people opt for higher voltages still.

As far as the nominal voltage goes, it might help if you think of it as an "average" voltage, seen during a manufacturer's capacity tests as the cell goes from fully charged to fully depleted.

Ah, that sounds just about reasonable. Thanks.

This may be a little off topic but:
Why is mAh the rating for battery capacity? Shouldn't it be a rating for current/flow, as it is how many milliamperes an hour?
I don't understand; I thought milliamperes measured the amount of power.

Thanks for your comments.
 

Phlogiston

Enlightened
Joined
Jan 7, 2015
Messages
601
Location
Scotland
The reason battery ratings are in mAh is so that you can see how much current you can draw for how long. In theory, it tells you how much current you can draw if you want the battery to last one hour - current in "mA" for a period of one "h". Twice the current for half an hour, half the current for two hours, and so on.

You could just as easily rate the cell in mWh - power in "mW" for a period of one "h" - but historically, most electrical systems have been designed to work at one particular voltage and that's that. In that context, once you've put the right cell in, voltage becomes almost irrelevant, because you can always figure out runtime using just the current draw and the mAh rating. There's no need to go as far as thinking about power. It's also easier and cheaper to build a current meter than a power meter.

Of course, nowadays we have devices like flashlights which accept a wide range of input voltages and draw however much current it takes to maintain the correct power level. In that context, it can be easier to use mWh instead of mAh, because that automatically takes account of the voltage differences between 1.2V NiMH, 1.5V primary lithium and 3.7V Li-Ion cells, for example.

Most people don't need to bother with that, though. Nine times out of ten, all you need to do is pick the right cell type - 3.7V Li-Ion, say - and compare mAh ratings to see which cell will give you the most runtime.

On a related note, manufacturers have a nasty tendency to stretch their mAh ratings by testing at lower currents for longer periods, taking advantage of the fact that virtually all batteries are most efficient at low discharge currents relative to their capacity. It's quite unusual to find a cell that will actually supply its mAh rating over an hour.

For example, a 3000mAh 18650 cell may well be rated at a discharge current of 0.2C - 20% of its mAh-rated current. In other words, it would be discharging at 600mA for 5 hours to achieve its mAh rating. You'd be lucky to actually get 3000mA out of it for an hour, despite the fact that drawing 3000mA is perfectly normal and many devices do exactly that.

This means that you can normally use a cell's mAh rating to get a reasonable idea of the runtime your device might achieve, but it won't be exact, and you're well advised to assume that you'll actually get a bit less in real life.

The only way to be certain of your device's runtime on a given cell is to put the fully-charged cell in, run the device normally and see how long it takes for the cell to run out of energy.
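
Here is a short Python sketch of the runtime arithmetic above, with a derating factor to reflect the point that cells rarely deliver their full rated mAh at heavier loads; the 0.85 figure is an arbitrary illustration, not a measured value:

```python
# Naive runtime estimate: capacity (mAh) / average draw (mA) = hours.
# The derating factor accounts for capacity ratings being taken at low
# currents (often 0.2C); real runtime at heavier draws comes in lower.

def estimated_runtime_hours(capacity_mah: float, draw_ma: float,
                            derating: float = 0.85) -> float:
    return (capacity_mah * derating) / draw_ma

capacity = 3000.0  # mAh, the example 18650 from the post
print(estimated_runtime_hours(capacity, 600.0))    # ~4.25 h at its 0.2C rating
print(estimated_runtime_hours(capacity, 3000.0))   # ~0.85 h at a 1C (3 A) draw
```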
 

Flashy808

Enlightened
Joined
Jul 2, 2015
Messages
282
The reason battery ratings are in mAh is so that you can see how much current you can draw for how long. In theory, it tells you how much current you can draw if you want the battery to last one hour - current in "mA" for a period of one "h". Twice the current for half an hour, half the current for two hours, and so on.

You could just as easily rate the cell in mWh - power in "mW" for a period of one "h" - but historically, most electrical systems have been designed to work at one particular voltage and that's that. In that context, once you've put the right cell in, voltage becomes almost irrelevant, because you can always figure out runtime using just the current draw and the mAh rating. There's no need to go as far as thinking about power. It's also easier and cheaper to build a current meter than a power meter.

Of course, nowadays we have devices like flashlights which accept a wide range of input voltages and draw however much current it takes to maintain the correct power level. In that context, it can be easier to use mWh instead of mAh, because that automatically takes account of the voltage differences between 1.2V NiMH, 1.5V primary lithium and 3.7V Li-Ion cells, for example.

Most people don't need to bother with that, though. Nine times out of ten, all you need to do is pick the right cell type - 3.7V Li-Ion, say - and compare mAh ratings to see which cell will give you the most runtime.

On a related note, manufacturers have a nasty tendency to stretch their mAh ratings by testing at lower currents for longer periods, taking advantage of the fact that virtually all batteries are most efficient at low discharge currents relative to their capacity. It's quite unusual to find a cell that will actually supply its mAh rating over an hour.

For example, a 3000mAh 18650 cell may well be rated at a discharge current of 0.2C - 20% of its mAh-rated current. In other words, it would be discharging at 600mA for 5 hours to achieve its mAh rating. You'd be lucky to actually get 3000mA out of it for an hour, despite the fact that drawing 3000mA is perfectly normal and many devices do exactly that.

This means that you can normally use a cell's mAh rating to get a reasonable idea of the runtime your device might achieve, but it won't be exact, and you're well advised to assume that you'll actually get a bit less in real life.

The only way to be certain of your device's runtime on a given cell is to put the fully-charged cell in, run the device normally and see how long it takes for the cell to run out of energy.

:duh2: :eek: Wow full marks for effort! :thumbsup:

I think I understand now. They use mAh to show how much can be drawn for how long so they use that to show how much is in the battery.

And is it because it is more convenient and common for manufacturers to use mAh instead of just mA, right?

Also, just to be sure: milliamperes and amps are still the actual measure of how much power, right?
Then how would you work out, say, how many mA a 3000mAh battery has?
- So basically, how to convert from current to amount. BUT in the process of typing this I realise that it's not possible to know exactly how much mA a battery contains without draining it out over a period of time at a set rate, right?? :shrug:

So I guess I'll just go by what I (and many) always go by: higher mAh=higher capacity, check manufacturers' voltage needs, don't worry about current too much!

& one last thing: just out of curiosity how might one go about converting mAh to mWh (I totally agree that it should be more common to have a mWh rating to take into account the voltages)?:twothumbs

Phew, that's a lot of questions :poof:.
Btw: Feel free to correct me where I'm wrong.
Thanks for Responding Phlogiston :thumbsup:
 

Phlogiston

Enlightened
Joined
Jan 7, 2015
Messages
601
Location
Scotland
Also, just to be sure: milliamperes and amps are still the actual measure of how much power, right?

Milliamperes and amps are current. Power would be milliwatts and watts, which you get when you multiply voltage by current.

Then how would you work out, say, how many mA a 3000mAh battery has?

This question doesn't really have an answer. You can draw as many or as few mA from a cell as you like, as long as you stay below the cell's maximum current ratings. However, you can only have as many mAh as the cell can contain, and then you either have to recharge it or put in a new one. The mA you draw determine how quickly your mAh will be exhausted.

& one last thing: just out of curiosity how might one go about converting mAh to mWh (I totally agree that it should be more common to have a mWh rating to take into account the voltages)?

To get mWh, multiply mAh by the nominal voltage.

For example, a 3000mAh Li-Ion cell with a 3.7V nominal voltage contains 11100mWh.
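
The same conversion as a couple of lines of Python, using the 3000mAh / 3.7V figures from the example above (the NiMH line is just for comparison):

```python
# mWh = mAh * nominal voltage

def mah_to_mwh(capacity_mah: float, nominal_volts: float) -> float:
    return capacity_mah * nominal_volts

print(mah_to_mwh(3000, 3.7))   # 11100.0 mWh for the example li-ion cell
print(mah_to_mwh(2000, 1.2))   # 2400.0 mWh for a 2000 mAh NiMH cell, for comparison
```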

Thanks for Responding Phlogiston :thumbsup:

You're welcome :)
 

Flashy808

Enlightened
Joined
Jul 2, 2015
Messages
282
Milliamperes and amps are current. Power would be milliwatts and watts, which you get when you multiply voltage by current.



This question doesn't really have an answer. You can draw as many or as few mA from a cell as you like, as long as you stay below the cell's maximum current ratings. However, you can only have as many mAh as the cell can contain, and then you either have to recharge it or put in a new one. The mA you draw determine how quickly your mAh will be exhausted.



To get mWh, multiply mAh by the nominal voltage.

For example, a 3000mAh Li-Ion cell with a 3.7V nominal voltage contains 11100mWh.



You're welcome :)

Very interesting information! Thank you, that helped a lot.
 

ganz-lite

Newly Enlightened
Joined
Jun 12, 2012
Messages
43
I'm finishing re-saving up to get a 2x18650 flashlight from JayRob (thank you, 2 months of undiagnosed pneumonia that culminated in a 105° fever and pneumonia in both lungs). The cells will be protected cells. If I'm not using the flashlight for extended periods, how do I store them to ensure that they stay happy? I'm assuming in a drawer of some sort (not in the flashlight or the charger). I want happy cells.
 

xzel87

Enlightened
Joined
Nov 15, 2014
Messages
296
Location
Sabah, Malaysia
I'm finishing re-saving up to get a 2x18650 flashlight from JayRob (thank you, 2 months of undiagnosed pneumonia that culminated in a 105° fever and pneumonia in both lungs). The cells will be protected cells. If I'm not using the flashlight for extended periods, how do I store them to ensure that they stay happy? I'm assuming in a drawer of some sort (not in the flashlight or the charger). I want happy cells.

It depends on whether you are storing the light without the intention to use it during the storage period, or storing it as an emergency light. If you don't intend to use it, discharge the batteries to about 3.7V, then store them in a cool, dry place. Some say to place them in an airtight Ziploc bag and put them in the fridge (some say the freezer), but personally I wouldn't do this, since you need to let the batteries come back to room temperature before charging them.
 