# LED heat vs INCAN heat



## tsask (Jun 16, 2008)

To explain one difference between the wasted energy and heat-producing properties of INCAN lights compared to LED lights, I'd like to know how the heat from an LED is different, aside from the amount.


----------



## IMSabbel (Jun 16, 2008)

tsask said:


> To explain one difference between the wasted energy and heat-producing properties of INCAN lights compared to LED lights, I'd like to know how the heat from an LED is different, aside from the amount.



The heat is basically the same.

The basic difference is the temperature of the object where the heating happens:

The wire in an INCAN is already very, very hot, so, being in thermal equilibrium, it radiates away everything it gets. What is usually called "heat" here is the part of the radiation that's not in the optical spectrum (75-95% of it).

With LEDs, the object that gets heated is a semiconductor. To work, it has to stay cool (in contrast to the INCAN, which works by getting HOT), so you cannot take the easy incan route of just letting it radiate the heat away.

Instead, you have to drain it away via a heatsink.
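The "75-95% not in the optical spectrum" figure can be sanity-checked from Planck's law. This is a rough sketch, not a measurement: a real filament is not an ideal blackbody, and the 2800 K used here is just a typical filament temperature assumed for illustration.

```python
import math

H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
KB = 1.381e-23   # Boltzmann constant, J/K

def planck(wl_m, temp_k):
    """Blackbody spectral radiance at wavelength wl_m (metres)."""
    x = H * C / (wl_m * KB * temp_k)
    if x > 700:            # exp() would overflow; radiance is effectively zero
        return 0.0
    return (2.0 * H * C**2 / wl_m**5) / (math.exp(x) - 1.0)

def band_fraction(lo_nm, hi_nm, temp_k, steps=4000):
    """Fraction of total emission between lo_nm and hi_nm (trapezoid rule)."""
    def integrate(lo_m, hi_m):
        dx = (hi_m - lo_m) / steps
        ys = [planck(lo_m + i * dx, temp_k) for i in range(steps + 1)]
        return dx * (sum(ys) - 0.5 * (ys[0] + ys[-1]))
    total = integrate(50e-9, 100e-6)   # covers essentially the whole spectrum
    return integrate(lo_nm * 1e-9, hi_nm * 1e-9) / total

# Visible band (380-750 nm) at a ~2800 K filament temperature:
print(f"visible fraction: {band_fraction(380, 750, 2800):.1%}")
```

At ~2800 K this comes out to under 10% of the emission in the visible band, i.e. roughly 90% invisible, consistent with the 75-95% range above; hotter filaments push more of the output into the visible.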


----------



## cdosrun (Jun 16, 2008)

Another difference is in the wavelength of the emissions for 'felt heat'. Heat can be felt in the beam from a relatively small incand, but only higher-power LEDs 'feel hot' in a similar way.

Here the difference is in the wavelength of light emitted by the source. Incands emit a high percentage of their output as infrared heat (wasted energy), which penetrates the skin more readily than higher frequencies; if you press a light against your skin it appears deep red, because higher frequencies are absorbed or reflected by the outer layers. As the nerve receptors for temperature lie under the epidermal layer of skin, longer wavelengths feel warmer than shorter ones; therefore it takes a much brighter LED light than incand to feel warm, but the 'heat' you feel from an LED is just absorbed light energy.

Sorry if that wasn't an element of the question at all.

Theoretically, an ideal, completely efficient LED would produce no heat at all, whereas heat is inherent to how an incand works.

Andrew


----------



## tsask (Jun 16, 2008)

Thanks for the info, YES, it all does apply to the question.
I did not know about the different wavelengths of radiant heat at all!


----------



## IMSabbel (Jun 16, 2008)

cdosrun said:


> Another difference is in the wavelength of the emissions for 'felt heat'. Heat can be felt in the beam from a relatively small incand, but only higher-power LEDs 'feel hot' in a similar way.
> 
> Here the difference is in the wavelength of light emitted by the source. Incands emit a high percentage of their output as infrared heat (wasted energy), which penetrates the skin more readily than higher frequencies; if you press a light against your skin it appears deep red, because higher frequencies are absorbed or reflected by the outer layers. As the nerve receptors for temperature lie under the epidermal layer of skin, longer wavelengths feel warmer than shorter ones; therefore it takes a much brighter LED light than incand to feel warm, but the 'heat' you feel from an LED is just absorbed light energy.
> 
> Andrew



Actually, that explanation is not true.
It's a common fallacy in science teaching.
Visible light penetrates nearly as well as infrared (at least in the range that matters, that is, the epidermis). The difference is there, but it's not huge.

The main difference is that an incand will radiate a factor of 5 more watts of radiation than an LED of equal brightness. Add to this the fact that most incans are brighter than most LEDs (in terms of lumens), and you easily get an order-of-magnitude difference from it.
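That factor-of-5 claim is easy to put numbers on. The luminous efficacies below (15 lm/W for an incan, 75 lm/W for an LED) are illustrative assumptions in the spirit of the post, not measured values:

```python
def input_watts(lumens, lm_per_watt):
    """Electrical power needed to produce a given light output."""
    return lumens / lm_per_watt

TARGET_LM = 150                       # same brightness for both sources
incan_w = input_watts(TARGET_LM, 15)  # assumed incan efficacy
led_w = input_watts(TARGET_LM, 75)    # assumed LED efficacy

# Every watt in eventually becomes heat, so equal brightness means the
# incan dumps 5x the heat of the LED under these assumptions.
print(f"incan: {incan_w:.1f} W in, LED: {led_w:.1f} W in "
      f"({incan_w / led_w:.0f}x the power, and heat)")
```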


----------



## frenzee (Jun 16, 2008)

The main difference is that in an incan all heat is dissipated as infrared out the front, whereas in an LED all heat is (or should be) transported away from the slug in the back to the heatsink. Visible-range LEDs don't emit in the infrared region. There is a good video on this subject on National's website:

http://www.national.com/nationaltv/abdshow16/index.html


----------



## cdosrun (Jun 16, 2008)

IMSabbel said:


> Actually, that explanation is not true.
> It's a common fallacy in science teaching.
> Visible light penetrates nearly as well as infrared (at least in the range that matters, that is, the epidermis). The difference is there, but it's not huge.
> 
> The main difference is that an incand will radiate a factor of 5 more watts of radiation than an LED of equal brightness. Add to this the fact that most incans are brighter than most LEDs (in terms of lumens), and you easily get an order-of-magnitude difference from it.



I agree, the depth of penetration of both vis and IR wavelengths through epidermal tissue is very much the same, but temperature perception usually occurs at greater tissue depths, into the dermis. My practical experience is more at the DNA level: in the lab, the peak absorption of DNA/RNA is at 260/280 nm, which limits UV penetration depth. In the visible range, the absorption of bound haemoglobin comes into play, with greater absorption at shorter wavelengths (blood is red), and at far-IR ranges water affects absorbance.

From this, it would appear as though the blood travelling in the dermis above the nervous tissue absorbs a higher percentage of shorter-wavelength light and transports the energy away from the deeper layers. Longer wavelengths of light, into the near-infrared range, would be absorbed less by the blood and would affect the underlying nervous tissue to a higher degree. I'm afraid I don't have access to scientific journals any more, but this link to a German regulatory paper on the subject of IR risks to skin shows a graph of skin penetration depth with respect to wavelength on page 6: http://www.icnirp.de/documents/infrared.pdf I can't comment as to the scientific rigour of the document, but I think it is persuasive.

I do agree with you about incand power: in order to match LEDs in visible emissions, they have to emit far more energy (they are inefficient visible-light emitters). My own experiments (purely for my own amusement) show that I can feel more heat being emitted by a MiniMag Solitaire than by my DBS on low. That is certainly not conclusive, but I think it is slightly persuasive that visible light doesn't feel as warm as IR.

Anyway, I am really not trying to be inflammatory, just interested; sorry for taking the thread somewhat off topic too.

Andrew


----------



## haserman (Jul 2, 2012)

here's a bit more on heating the body (which is what i want to do) that i got from an expert in the field. it is a bit off-topic from the original question, but should serve to further illuminate (haha) the physiological effects of infrared radiation:

[Heating the body] is constrained by thermal diffusion and convective cooling via blood flow, which usually has a time constant on the order of 1-3 s. People usually avoid exceeding 100 mW/cm2 of light if they wish to limit skin heating to only about 1-2 degrees C.

If your goal was to maximize whole-body heating, you might consider a wavelength range of, e.g., 500-600 nm. Using this wavelength range optimizes hemoglobin absorption. The skin has a network of blood vessels (the superficial venous plexus) at a depth of about 100-250 um. Using 500-600 nm light would heat the blood within this superficial venous plexus and let the blood circulation carry the heat to the body.

If your goal was to maximize skin temperature, for the maximum sensation of heating, then 700-800 nm light might be best. I would not worry about the efficiency of heating, but rather would consider optimizing the depth of light penetration into skin so as to avoid superficial overheating. Hence, I would consider working with about 700-800 nm light, where there is an optimum of light penetration.

The issue of melanin adds another factor to the problem. If you are dealing with pigmented individuals, then the shorter wavelengths (500-600 nm) are problematic. Moving to the invisible near-infrared spectrum is better; the range of 800-900 nm would reduce the melanin absorption.





cdosrun said:


> I agree, the depth of penetration of both vis and IR wavelengths through epidermal tissue is very much the same, but temperature perception usually occurs at greater tissue depths, into the dermis. My practical experience is more at the DNA level: in the lab, the peak absorption of DNA/RNA is at 260/280 nm, which limits UV penetration depth. In the visible range, the absorption of bound haemoglobin comes into play, with greater absorption at shorter wavelengths (blood is red), and at far-IR ranges water affects absorbance.
> 
> From this, it would appear as though the blood travelling in the dermis above the nervous tissue absorbs a higher percentage of shorter-wavelength light and transports the energy away from the deeper layers. Longer wavelengths of light, into the near-infrared range, would be absorbed less by the blood and would affect the underlying nervous tissue to a higher degree. I'm afraid I don't have access to scientific journals any more, but this link to a German regulatory paper on the subject of IR risks to skin shows a graph of skin penetration depth with respect to wavelength on page 6: http://www.icnirp.de/documents/infrared.pdf I can't comment as to the scientific rigour of the document, but I think it is persuasive.
> 
> ...


----------



## alpg88 (Jul 2, 2012)

it isn't important what kind of heat/wavelength it is, for inc. and LED.
one important thing: LEDs need to have the heat moved away from them to work right; LEDs need "cooling". bulbs emit heat; bulbs don't really care if you drive the heat away or simply contain it.
hot LED = bad thing.
hot bulb = not bad thing.


----------



## bshanahan14rulz (Jul 2, 2012)

Wavelength is one of the factors that determines how the energy is absorbed or reflected by the target, so indeed the wavelength is important.


----------



## haserman (Jul 2, 2012)

to clarify, LEDs need to operate cool. if they are high-power LEDs, this means that you'll need to attach a heatsink (a piece of metal, thermally bonded) to the LED. if it gets too hot, it will burn out.

in contrast, an incandescent light is _designed_ to operate hot. it does not need a heatsink. note, though, that recessed lights that take incans are typically designed to take parabolic reflector bulbs. a regular incan will send more heat upwards (because there is no reflector in the bulb) than the can is rated for. in general, this does not create a problem, but it is good to recognize that there are _always_ limits on the heat produced.
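The "attach a heatsink" point can be made concrete with the standard thermal-resistance model: junction temperature rises linearly with dissipated power through the series junction-to-air thermal path. All the resistance values below are made-up but plausible numbers for illustration, not specs for any real emitter:

```python
def junction_temp_c(ambient_c, heat_w, r_jc, r_cs, r_sa):
    """Steady-state junction temperature: ambient plus dissipated power
    times the series thermal resistances (junction-case, case-sink,
    sink-air, all in C/W)."""
    return ambient_c + heat_w * (r_jc + r_cs + r_sa)

HEAT_W = 3.0   # heat to be removed from a hypothetical high-power LED

# With a reasonable heatsink (10 C/W total path), the die stays cool:
print(junction_temp_c(25.0, HEAT_W, 6.0, 1.0, 3.0))    # 25 + 3*10 = 55 C

# With no heatsink (say 60 C/W straight to free air), it cooks:
print(junction_temp_c(25.0, HEAT_W, 6.0, 1.0, 53.0))   # 25 + 3*60 = 205 C
```

This is why the same LED can be fine on a big slug of aluminum and burn out in minutes in free air, while a bulb just radiates the same power away.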



alpg88 said:


> it isn't important what kind of heat/wavelength it is, for inc. and LED.
> one important thing: LEDs need to have the heat moved away from them to work right; LEDs need "cooling". bulbs emit heat; bulbs don't really care if you drive the heat away or simply contain it.
> hot LED = bad thing.
> hot bulb = not bad thing.


----------



## Hoop (Jul 3, 2012)

cdosrun said:


> haemoglobin



OH NO HE DIDN'T!


----------



## alpg88 (Jul 3, 2012)

bshanahan14rulz said:


> Wavelength is one of the factors that determines how the energy is *absorbed or reflected by the target,* so indeed the wavelength is important.



uh, what??


----------



## haserman (Jul 3, 2012)

cdosrun said:


> As the nerve receptors for temperature lie under the epidermal layer of skin, longer wavelengths feel warmer than shorter ones; therefore it takes a much brighter LED light than incand to feel warm but the 'heat' you feel from an LED is just absorbed light energy.
> Andrew



not exactly. different wavelengths have different absorption and reflection. this is NOT a linear function of wavelength; i.e., there are peaks at certain wavelengths. it depends on the absorption by the epidermis, the dermis, fat, water, and blood. it appears to be a fairly complex subject.

all absorbed radiation, regardless of wavelength, will cause the temperature to rise. radiation that goes deeper into the body may not be felt as heat, as there are no thermoreceptors at that depth.


----------



## haserman (Jul 3, 2012)

yes, *bshanahan14rulz* is correct. see my replies elsewhere in this thread.



alpg88 said:


> uh, what??


----------



## EricB (Jul 3, 2012)

So the purpose of the heatsink is to disperse _excess_ energy from the 120 V supply, right? If there were less current/voltage (as with battery power), then it wouldn't be needed, and the LED would run completely cool, right?

Heatsinks are now usually on the base (to the point that only the top dome of the bulb glows), and in some bulbs they are quite bulky. I imagine if LEDs ever became universal, the source voltage for lighting could be dropped? (But then there are all the other electric appliances in the house.)


----------



## alpg88 (Jul 3, 2012)

haserman said:


> yes, *bshanahan14rulz* is correct. see my replies elsewhere in this thread.


your replies, as you said yourself, are off topic. it might be important in the medical field or some other scientific field.
but as far as light manufacturing, [email protected], and operation go, all that matters is where the heat is, where you want it to be, the temperature in °C or °F, and the amount in BTU or kcal or whatever else they measure amounts of heat in. also the thermal conductance and resistance of the materials.


----------



## A10K (Jul 3, 2012)

No. Consider a flashlight that runs at 3 A from a 3.7 V battery. It gets hot, very hot, even when there is little power conversion going on. While the "excess" matters, it has nothing to do with the power dissipated by the actual LED. As IMS pointed out earlier, LEDs are usually much more efficient than incans of equivalent output, but they are still far from ideal; a quick check of the Wikipedia page shows that about 70% of the power into an LED goes into heat, not light (versus over 90% for an incan). The difference is that this excess heat does not radiate away on its own and needs a path to carry it away from the actual LED chip.
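Those numbers can be checked quickly. The 70%/90% heat fractions below are the rough figures from the post above, not measured values:

```python
volts, amps = 3.7, 3.0
input_w = volts * amps          # ~11.1 W into the emitter

led_heat_w = input_w * 0.70     # heat that must be conducted away at the die
incan_heat_w = input_w * 0.90   # same power in an incan, but radiated instead

print(f"{input_w:.1f} W in; the LED must shed ~{led_heat_w:.1f} W through "
      f"its heatsink, while an incan would radiate ~{incan_heat_w:.1f} W")
```

So even a "little" 3 A light has several watts of heat concentrated in a chip a few millimetres across, which is why the host gets hot fast.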


----------



## Christexan (Jul 3, 2012)

I've been corrected in the past about these answers being too simplistic, so I feel a need to post.

All the energy used in either an LED or an incandescent ends up as heat at some point in the process. The more accurate question, I think, is "what is felt immediately as heat emitted directly from the source?" Most of the answers above relate to that: basically, you feel the infrared directly and more immediately than other wavelengths.

There are many other factors in this question, though. Even UV emissions eventually result in heat. Infrared feels hot "immediately" due to molecular interactions with the skin cells etc., but UV knocks electrons around their orbits, and as those settle down the result is felt as heat. Think of a sunburn: when in the sun, you are "hot" immediately (step from shade to sun); what you feel at that moment is infrared. Stay out for 60 minutes at noon and go inside, and a few hours later your skin still feels "hot"; the infrared effects are long gone, and what you are feeling now is the resultant heat from the UV energy that was absorbed at the atomic level, released as the atoms settle back down.

Here is a link I found a while back that covers it better than I can:
http://hyperphysics.phy-astr.gsu.edu/hbase/mod3.html

All wavelengths/photons interact at some point with other objects and generally result in eventual heat production.

The other part really missing here is the efficiencies. It's been touched on, but if you want the visible light output of a 5-watt incan bulb (for argument's sake, at 15 lumens/watt, that's 75 lumens), you only need a fraction of a watt of LED (75 lm / 100 lm per watt = 0.75 watts). So the LED really DOES generate less heat (less power input = less total output) for a given lumens output.

With both at 5 watts input, in a closed system (not looking at electrical losses elsewhere, etc., assuming 5 watts direct to the source), both will be emitting the same output energy, which at some point is going to be absorbed and result in heat production. The LED emits a larger proportion of it in the optical wavelengths that we desire, but ultimately it all converts into heat.

Light is just a form of energy, and when energy is absorbed by impacting a substance, the result is heat production (or, given enough energy, atomic changes that modify the underlying matter, but otherwise all of it eventually results in heat). Ultimately, at the end of enough time, the universe will just be black and full of jittering particles scattered throughout space, too far apart to interact, too cool to radiate any energy in the form of light, etc. Not to get too existential on the subject, LOL.


----------



## haserman (Jul 3, 2012)

EricB, it is probably a bit of both. the LED will generate a certain amount of heat as a byproduct of the light it generates. converting the source voltage to something appropriate for the LED probably also contributes some heat. my guess is that the former is the larger part of the waste. I am _not_ an expert on this, though.


----------

