How much heat do LEDs make per watt?

Number21

Newly Enlightened
Joined
Aug 25, 2012
Messages
48
I'm trying to find out how much heat a high-power LED puts into its heat sink. Say I have a low-cost 100 W Chinese LED chip that puts out 90-100 lumens/watt. How much of that 100 watts is coming out directly as heat energy?

I'm working on a project where I hope to use ten of the 100 W COB LED chips mounted to an aluminum plate. On the back of the plate would be a coil of copper tubing with water being pumped through it. How many BTUs or watts of total heat output would I be looking at here?
 

parametrek

Enlightened
Joined
Apr 3, 2013
Messages
578
They are around 20%-30% efficient. I always spec out my coolers for 0% efficiency though. This provides some margin for normal use and will also prevent the LED from overheating in a worst case scenario where the light is set face down on a surface. The surface may still catch fire however. :)
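As a rough sizing sketch of the 0% efficiency worst case above: all electrical input becomes heat, and the water flow needed to carry it away for a chosen temperature rise follows from the specific heat of water. (The function name and the 10 °C rise here are just illustrative choices, not from the thread.)

```python
# Worst-case heat load and cooling-water flow estimate.
# Assumes 0% luminous efficiency (all electrical input becomes heat),
# per the sizing-margin approach described above.
SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K)

def required_flow_lpm(heat_watts, delta_t_c):
    """Liters per minute of water needed to carry away heat_watts
    with a delta_t_c temperature rise across the cooling loop."""
    kg_per_s = heat_watts / (SPECIFIC_HEAT_WATER * delta_t_c)
    return kg_per_s * 60.0  # 1 kg of water is ~1 liter

heat = 10 * 100  # ten 100 W COBs, worst case: 1000 W of heat
print(round(required_flow_lpm(heat, 10.0), 2))  # ~1.43 L/min for a 10 °C rise
```

A modest aquarium-style pump covers that flow easily; the harder part in practice is the thermal path from the COB through the plate to the water.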
 

mattheww50

Flashlight Enthusiast
Joined
Jun 24, 2003
Messages
1,048
Location
SW Pennsylvania
At 100 lumens per watt the efficiency is about 20%, so your 100 watt LED chip is going to generate about 80 watts as heat. A watt is 3.41 BTUs per hour, so you are looking at about 273 BTUs per hour for each 100 watt LED device.
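The arithmetic above can be sketched in a few lines; the efficiency figure is an input (the thread goes on to debate whether 20% or ~30% is right), and 3.412 BTU/hr per watt is the standard conversion factor:

```python
# Convert LED heat output to BTU/hr, following the estimate above.
BTU_PER_HR_PER_WATT = 3.412  # 1 W = 3.412 BTU/hr

def heat_btu_per_hr(input_watts, efficiency):
    """Heat rejected to the sink, in BTU/hr, for a given electrical
    input and luminous efficiency (0.0 to 1.0)."""
    heat_watts = input_watts * (1.0 - efficiency)
    return heat_watts * BTU_PER_HR_PER_WATT

print(round(heat_btu_per_hr(100, 0.20)))  # 273 BTU/hr per chip at 20% efficiency
```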
 

ssanasisredna

Enlightened
Joined
Oct 19, 2016
Messages
457
This question has been answered numerous times on cpf and can also be found readily on the web. I think a search of cpf on this topic will give you answers more accurate than the posts that have already been made in this thread. 100 lumens per watt is much closer to 30% than 20.
 

Number21

Newly Enlightened
Joined
Aug 25, 2012
Messages
48

Yes, I searched before I asked and found about 30 different answers to the same question, many of them disagreeing with each other. That's why I asked.

I'd hazard a guess that most LED questions have already been asked and answered at this point, so I guess you should just close the forum down, huh?
 

ssanasisredna

Enlightened
Joined
Oct 19, 2016
Messages
457

If there are 30 disagreeing answers why do you think the next one will be correct? The last 2 in this thread were not.
 

Number21

Newly Enlightened
Joined
Aug 25, 2012
Messages
48

I think the better question is why do you feel the need to argue about a question that has already been answered and not offer any useful advice yourself?
Please just save us all some time and don't bother replying here anymore.

I appreciate those who have offered useful information.
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
You can make a reasonable estimate of how much input electrical power is converted to light. The rest is converted to heat.

The maximum theoretical efficacy depends on the spectrum. This is because a lumen is a unit of brightness as perceived by the human eye, and a watt of green light looks brighter than a watt of red or blue light. So if LEDs converted 100% of input power to light, a green LED would look brightest, yellow next, and red and blue the dimmest, all other things being equal. The exact relationships would depend on the exact wavelengths chosen. At the far ends of human perception, the transitions to ultraviolet or infrared, a watt of light would have zero lumens.

Since "white" light is a mix of various colors, you'd expect white to fall somewhere between the maximum (683 lm/W at 555 nm) and the minimum, zero. Most white LEDs have a spectrum that translates to 300-340 lm/W, with those heavy in green and yellow at the higher end and those heavy in red and/or blue at the low end. Note that this is entirely due to the spectrum of the LED, not the quality, how it's pumped, how hard it's driven, etc. (except as those things impact the spectrum).

So if you have a neutral white LED with decent CRI, it would be reasonable to assume you are getting a spectrum worth around 320 lm/W. If this LED is emitting only 100 lm/W, then it is converting 100 lm/W / 320 lm/W * 100 = 31% of the input power to light. The remainder, 69%, is heat.

To me this is easier to grasp the other way around. If your LED is converting 31% of 1 watt of input power into light, that's 0.31W of light output. If you multiply this by 320 lm/W, you get 100 lm!

Keep in mind that other parts of your system will absorb some of the light, converting it to heat and reducing the overall system efficacy. It's not hard for lenses, reflectors, filters, etc to absorb 20% or more of your emitter's output. You'd have to judge whether this would impact your heatsink calculations.
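The whole estimate above can be condensed into a short sketch. The 320 lm/W default is the assumed spectral maximum for a neutral-white, decent-CRI emitter as discussed, and the function name is just illustrative:

```python
# Estimate the heat fraction from measured efficacy and the spectral
# maximum efficacy of the emitter's light mix (per the reasoning above).
def heat_fraction(measured_lm_per_w, spectrum_max_lm_per_w=320.0):
    """Fraction of electrical input converted to heat.
    spectrum_max_lm_per_w is ~300-340 for typical white LEDs."""
    light_fraction = measured_lm_per_w / spectrum_max_lm_per_w
    return 1.0 - light_fraction

f = heat_fraction(100.0)
print(f"{f * 100:.1f}% of input power becomes heat")  # ~69% at 100 lm/W
```

Swapping in a different spectral maximum (say 300 for a high-CRI warm white) shows why answers in different threads disagree: the heat fraction for the same 100 lm/W emitter shifts by a few percent either way.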
 

Number21

Newly Enlightened
Joined
Aug 25, 2012
Messages
48
Thanks for the detailed reply. :thumbsup:


Another note to ssanasisredna - search isn't even enabled on this forum. *sigh*
 

Number21

Newly Enlightened
Joined
Aug 25, 2012
Messages
48
Have you heard of google?

Either be useful or be gone. Telling somebody to search is never a useful answer and it's usually rude. I've already asked you once, please do not post here anymore. Your comments are not helpful to anybody.
 
