# Mixing LED color bins to get high CRI?



## jasonck08 (Mar 4, 2015)

A while ago I grabbed 3x Xeno E03's: a warm white, a cool white, and a neutral white. I turned them all on in high mode and grouped them together in my hand, and I was surprised at how nicely the three different color temperatures of light blended together and how vibrant colors appeared.

So my question is this: Does mixing 3 different color temperatures of LEDs increase the CRI? For example, the CW was probably ~70 CRI, the NW ~75 and the WW around ~80 CRI, and the color temps would have been ~3250K, 4750K and 6500K. Would these 3 blended LEDs yield a combined CRI of, say, 80-90+? If so, this might be a good alternative to a single color temperature high CRI LED (which is generally not very efficient).

Thoughts?


----------



## more_vampires (Mar 4, 2015)

http://www.candlepowerforums.com/vb/showthread.php?396445-Light-Design

http://www.candlepowerforums.com/vb...04-Variable-Color-Temperature-4xAA-Flashlight



> Does mixing 3 different color temperatures of LEDs increase the CRI?



Apparently, it CAN.

Something else to consider is that perhaps CRI isn't all it's cracked up to be. A couple of months ago, I stumbled across a slew of articles all at once claiming that CRI is fundamentally broken. I even encountered a write-up of light with *negative CRI.* My initial response was something like "WTH is this??!?!"

http://en.wikipedia.org/wiki/Color_rendering_index


> Numerically, the highest possible CRI is 100, for a Black body (incandescent lamps are effectively blackbodies), dropping to negative values for some light sources. Low-pressure sodium lighting has negative CRI



http://lumenistics.com/what-is-color-rendering-index-cri/


> *CRI has also been found to be an inaccurate, unreliable predictor of color preference* of solid-state lighting products such as light-emitting diodes (LEDs), which emit a much different light than fluorescent or HID lamps, and can result in lower *or even negative CRI values for some of them.* For instance, some LED products with a CRI as low as 25 can produce white light that actually make object colors appear more vivid.





> CQS Debuts: To help remedy these drawbacks, CIE established a technical committee in 2006 to develop and recommend a new color rendering metric. In June 2010, Wendy Davis, chair of the committee, debuted the Color Quality Scale (CQS), which she developed with colleagues at the National Institute of Standards and Technology (NIST), the federal technology agency that works with industry to develop and apply technology, measurements, and standards.



I've also talked with a couple of veteran flashlight modders who also feel that CRI is irrelevant to color rendition, so it's not just technical articles talking. It's strange that the CRI rating isn't that great at describing LED light; I wouldn't have guessed that.


----------



## a1mu1e (Mar 4, 2015)

I've read that also. As I understand it, CRI measures how well, on average, a set of test colors is rendered relative to a blackbody reference at the same color temperature. If there are gaps in the spectrum but the rendering errors average out across the test samples, CRI can come out artificially high.
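To illustrate the averaging point: the general CRI figure (Ra) is just the mean of the eight special indices R1-R8, so a lamp can score well overall while a sample outside the average, like the saturated-red R9, is poor. A minimal sketch (the individual Ri scores below are made-up, purely for illustration):

```python
# CRI (Ra) averages only the eight test samples R1..R8; the
# saturated-red sample R9 is reported separately and can be much
# worse without lowering Ra. Scores here are hypothetical.
R = {1: 95, 2: 92, 3: 90, 4: 88, 5: 91, 6: 89, 7: 93, 8: 86, 9: 20}

Ra = sum(R[i] for i in range(1, 9)) / 8  # R9 is not included
print(f"Ra = {Ra:.1f}, yet R9 (saturated red) = {R[9]}")  # Ra = 90.5
```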




----------



## more_vampires (Mar 4, 2015)

What I keep hearing about color rendition is to ignore the ratings and numbers and try it.

I have a pile of various colored objects; I turn on one or more lights and flick one on the pile, then flick the other on the pile... back and forth.

Strange that some colors pop and some don't and it varies by individual emitter. You can't really go by manufacturer claims when it comes to actual color rendition as your eyes see them.

Playing with some toy IR goggles, I found it strange that "black" objects are not all truly black. You can see a greenish or other hue within the object when viewed with infrared gear. This meshes with what I'd heard in science class all those years ago: the vast majority of black objects aren't really black, just a very, very dark shade of some other color.


----------



## Anders Hoveland (Mar 4, 2015)

jasonck08 said:


> Mixing LED color bins to get high CRI?


This _used to_ be a strategy used by some D.I.Y. hobbyists. The old white LEDs (and still many cheap 70 CRI Chinese LEDs) used a single phosphor formulation. Depending on the degree of doping in the Ce:YAG microcrystalline structure, the phosphor emission could be more orange-shifted or more green-shifted, but the wavelength distribution was still mostly centered around a narrow range of wavelengths.

Most modern white LEDs use a formulation that combines at least two different types of phosphor, for wider coverage across the spectrum and better CRI. The coverage may still not be great, but it is still better than a single phosphor.
So any potential benefit of combining two different color temperatures of LEDs is much smaller when the CRI of each initial LED is already over 80.

Most of these white LEDs around 80 CRI do not actually use a green and a red phosphor; it is more like a yellowish-green phosphor and an orange phosphor (both Ce:YAG based). While this results in higher lumen output, it is not ideal from the standpoint of CRI. Red phosphors generally tend to have very low efficiency, both because the human eye has lower sensitivity to longer red wavelengths, and because a red-shifted phosphor emits so much of its energy in the near-infrared. (There are several strategies to get around these problems, but that is a separate discussion.) An orange-centered Ce:YAG phosphor still emits a fair amount of longer red wavelengths, so the color rendering is not so terrible.
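To put numbers on the eye-sensitivity part: the CIE photopic luminosity function V(λ) drops steeply past 600 nm, so equal radiant power in the deep red yields far fewer lumens. A quick check with the standard tabulated V(λ) values:

```python
# Photopic luminosity function V(lambda), standard CIE tabulated values.
# Lumens scale as 683 * V(lambda) per radiant watt, so a watt of deep-red
# 660 nm light "counts" far less than a watt of orange-red 620 nm light.
V = {555: 1.000, 600: 0.631, 620: 0.381, 640: 0.175, 660: 0.061}

ratio = V[660] / V[620]
print(f"1 W at 660 nm gives {ratio:.0%} of the lumens of 1 W at 620 nm")
```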




jasonck08 said:


> Mixing LED color bins to get high CRI?
> So my question is this: Does mixing 3 different color temperatures of LED's increase the CRI?


The problem with mixing the light from a high color temperature source with that of a lower color temperature source is that the overall combined light ends up a little magenta-tinted. Remember, the blackbody curve on a chromaticity diagram is curved, and a mixture of two sources sits on the straight line between their color points, so the mix falls below the curve on the magenta side; as we follow the blackbody curve, the proportion of blue increases faster than the proportion of green.
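The geometry can be checked numerically. This is just a sketch: it uses the Kim et al. cubic polynomial approximation of the Planckian locus, and assumes a 50/50 mix with equal X+Y+Z weight (which keeps chromaticity mixing linear in xy); real emitters sit near, not exactly on, the locus.

```python
def planck_xy(T):
    """Approximate CIE 1931 xy of a blackbody at T kelvin
    (Kim et al. cubic fit, used here for 2222 K - 25000 K)."""
    if 2222 <= T <= 4000:
        x = (-0.2661239e9 / T**3 - 0.2343589e6 / T**2
             + 0.8776956e3 / T + 0.179910)
        y = (-0.9549476 * x**3 - 1.37418593 * x**2
             + 2.09137015 * x - 0.16748867)
    elif 4000 < T <= 25000:
        x = (-3.0258469e9 / T**3 + 2.1070379e6 / T**2
             + 0.2226347e3 / T + 0.240390)
        y = (3.0817580 * x**3 - 5.8733867 * x**2
             + 3.75112997 * x - 0.37001483)
    else:
        raise ValueError("T outside the approximated range")
    return x, y

x1, y1 = planck_xy(2700)   # warm source on the locus
x2, y2 = planck_xy(5000)   # cool source on the locus

# 50/50 mix: chromaticity lands on the straight chord between the points
xm, ym = (x1 + x2) / 2, (y1 + y2) / 2

# Find the locus point with the same x by a brute-force temperature scan
T_near = min(range(2300, 6000), key=lambda T: abs(planck_xy(T)[0] - xm))
y_locus = planck_xy(T_near)[1]

print(f"mix ({xm:.4f}, {ym:.4f}) vs locus y = {y_locus:.4f} at ~{T_near} K")
print("mix sits below the locus -> magenta tint" if ym < y_locus
      else "mix sits above the locus")
```

The chord falls measurably below the locus (a negative Duv, in lighting-spec terms), which is exactly the magenta shift described above.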

There is really no point in using more than 2 different color temperatures. Even with an orange-centered phosphor and a green-centered phosphor, there is still going to be a huge amount of overlap in the yellow wavelength range; in fact, the combination will still have more yellowish-green light than an ideal blackbody emission should. So there is no point adding a third color temperature with a phosphor emission centered in the middle.
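To see the yellow pile-up, model the two phosphor emissions as broad Gaussians and sum them. The 540 nm / 590 nm peaks and the 60 nm width are made-up but plausible values for green-shifted and orange-shifted Ce:YAG bands, not measured data:

```python
import math

# Hypothetical single-phosphor emission bands: a green-shifted phosphor
# peaking near 540 nm and an orange-shifted one near 590 nm, each modeled
# as a broad Gaussian (peak and width values are illustrative only).
def band(lam, peak, width=60.0):
    return math.exp(-((lam - peak) / width) ** 2)

combined = {lam: band(lam, 540) + band(lam, 590)
            for lam in range(430, 701, 5)}
peak = max(combined, key=combined.get)
print(f"combined emission peaks at ~{peak} nm")  # lands in the yellow
```

Because the two bands overlap so heavily, their sum peaks between them, in the yellow, which is why a third mid-spectrum emitter adds nothing.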

I researched the spectral graphs, and even if you combined a standard 5000K LED with a standard 2700K LED, a high CRI LED (~94 CRI) would _still_ provide better coverage across the spectrum, both in the deep red and in the cyan.
Also, a single-phosphor LED does not use a truly green-centered phosphor unless the color temperature is _extremely_ high (~10000K), and that would certainly add much more blue light than you want. I did buy some special (cheap Chinese-made) "cold green" LEDs, though, which use a phosphor as green-shifted as Ce:YAG can go.


----------



## Anders Hoveland (Mar 5, 2015)

more_vampires said:


> I've also talked with a couple of veteran flashlight modders who also feel that CRI is irrelevant to color rendition, so it's not just technical articles talking.


There is certainly some truth to that, but it is not completely irrelevant. CRI is a good general indicator of color rendering ability, but not a perfect or exact one. An LED with 90 CRI is generally going to be better than one with 80 CRI, but when it comes to comparing a 92 CRI LED to one rated at 94, for example, things can be less clear.


----------



## SemiMan (Mar 7, 2015)

That's not true either. Try rendering blues and many greens with 100 CRI 2700K ... Does not work well. A balanced 85 CRI 4000K will be way better. Gamut is important.

Posted by really crappy Tapatalk app that is questionable wrt respect of personal data.


----------



## Anders Hoveland (Mar 7, 2015)

SemiMan said:


> Try rendering blues and many greens with 100 CRI 2700K ... Does not work well. A balanced 85 CRI 4000K will be way better. Gamut is important.


This is getting off-topic, but there is some small amount of truth to this. I have noticed that under a 2700K (long-life) incandescent bulb, red color rendering seems inferior to a 3000K halogen. Why would that be? Incandescent light has PLENTY of deep red wavelengths. The problem, I believe, is color contrast. Yes, a red colored object is reflecting mostly red light, but the white background behind it is so orange-tinted by the low color temperature that the contrast is not as good. Red roses under 2700K incandescent light do look a little orange-tinted, not as saturated a red. But at 2850K it is really not noticeable to me, and at 2900K everything seems perfectly colorful. The only minor exception might be deep royal blues and indigo.

But when we are dealing with LED light, which generally does not have the best red saturation to begin with, contrast effects may be even more important. I have noticed that reds seem more saturated under my "daylight" color temperature LEDs than under the "warm white" ones. (Saturation is still a different thing from the brightness of a color, though.)


----------



## SemiMan (Mar 7, 2015)

Small amount of truth? There is a huge amount of truth. There are even now high-gamut lights with sub-100 CRI on purpose, to provide rendering across a wider range of colors. Halogen = hot poker ... hardly the last word in lighting. Natural black body sources bright enough to be called "lighting" don't even exist in the 2700-3000K range, so why even try to replicate them?


----------



## idleprocess (Mar 8, 2015)

I do enjoy reading Anders' description of test results as reported by his highly-calibrated eyeballs.


----------



## SemiMan (Mar 8, 2015)

Me too, but it's always "off-topic" and he never "has the time" to really get into it


----------



## JoakimFlorence (Oct 5, 2022)

I can see no reason why mixing different color temperatures of LEDs would result in higher CRI, unless you are using a higher-CRI, lower color temperature LED and combining it with a lower-CRI, higher color temperature LED.
In that case, all that red-orange wavelength light could help greatly, both because a low color temperature spectrum has a lot more of it, and because the red wavelengths do not have to be as deep at higher color temperatures to give high CRI.

(I still doubt you would be able to attain a higher CRI than that of the highest-CRI LED you are using, though the result might appear better simply because colors appear more saturated above 2700K; 2700K is so low it can orange-wash the appearance of colors. Viewed that way, this is pretty much just a way to raise the color temperature of a high CRI LED.)

Also keep in mind that mixing lower and higher color temperature LEDs together can result in a slightly magenta tint. This is because the blackbody curve on the chromaticity diagram is a curve, not a straight line, and the mixture lies on the straight line between the two color points, below the curve.


----------

