# CRI / tint



## Brasso (Aug 26, 2011)

I love the high CRI xpg. Very soothing tint. Easy on the eyes.

On the contrary, I find the P4s a puke shade of blue.

To each their own.


----------



## kaichu dento (Aug 26, 2011)

*Re: HDS Systems EDC # 15*



GaAslamp said:


> Same here.  In retrospect, how silly it was of me to hope that the new high-CRI emitter would be a 5000K 92 CRI Nichia or something equally compelling.  I would have bought such a Clicky without hesitation, but since it seems that they've actually regressed instead...oh well.... :shrug:


This is one of the most stupid statements made in this thread over this issue. You're assuming that what you like is progress and what you don't like is a step backwards. Cold tints have always been available from LEDs, and now that warm and neutral tint fans are getting something to get excited about, you want to consider it regression.

No one's preferences are important enough to negate the validity of other people's preferences.


----------



## GaAslamp (Aug 26, 2011)

*Re: HDS Systems EDC # 15*



kaichu dento said:


> This is one of the most stupid statements made in this thread over this issue.



I'd like to humbly thank you for taking the high road and going easy on me. :bow: I realize that what you must be thinking is far, far worse, but in the interest of respectful discourse you've shown remarkable, nay, extraordinary restraint, and this, good people, is a shining example of why lovecpf:kiss:.



kaichu dento said:


> You're assuming that what you like is progress, and what you don't like is a step backwards.



No, I just gave my own point of view on the subject, much like everybody else does around here. To me the change HDS made is a step backwards, away from natural lighting and toward artificial lighting similar to what we've had for more than a century, although I understand that others have different points of view. You assumed that I consider my way the only "right" way, but you're wrong--not stupid, mind you, just plain wrong in this instance.



kaichu dento said:


> Cold tints have always been available from LED's



That's true enough, but not cool white LEDs that render colors with a high degree of accuracy. Such LEDs exist, I believe (I'd have to use them myself to confirm), but I've yet to see one used in a flashlight.



kaichu dento said:


> and now that warm and neutral tint fans are getting something to get excited about



As regrettably uncommon as these tints still are, given how much people like them, they've been around and there are more high-CRI versions on the way from several manufacturers. It's nothing we haven't seen before with incandescent lighting (which incidentally has a higher CRI), and I was hoping for something new, which I would view as progress.



kaichu dento said:


> you want to consider it regression.



This may seem a bit nitpicky, but it's not that I WANT to consider it a regression, it's just that in some ways I DO consider it a regression, and I have every right to hold and express my point of view.



kaichu dento said:


> No ones preferences are important enough to negate the validity of other peoples preferences.



Did I ever say otherwise? :thinking: That's very imaginative--I'm impressed!


----------



## the.Mtn.Man (Aug 27, 2011)

*Re: HDS Systems EDC # 15*

Were people really expecting the high CRI emitter to _not_ have a warm tint?


----------



## Girryn (Aug 27, 2011)

*Re: HDS Systems EDC # 15*



the.Mtn.Man said:


> Were people really expecting the high CRI emitter to _not_ have a warm tint?


 
wanted 4000k-5000k


----------



## TyJo (Aug 27, 2011)

*Re: HDS Systems EDC # 15*



the.Mtn.Man said:


> Were people really expecting the high CRI emitter to _not_ have a warm tint?


I was thinking the same thing. This disappointment, and the claim that the high-CRI XP-G is some sort of regression, makes no sense. The high-CRI XP-G is one of the most efficient high-CRI emitters currently available. High-CRI lights tend to be warm; that is the norm, not the exception. The Nichia emitter, which is supposedly more neutral, hasn't been proven reliable, and we know Henry doesn't use an emitter unless it has a proven track record. I tried to find information about the high-CRI Nichia emitter but didn't have much luck (specifically its CRI rating and efficiency). I doubt high-CRI alternatives to the XP-G can match the XP-G's availability, efficiency, etc.


----------



## JWRitchie76 (Aug 28, 2011)

*Re: HDS Systems EDC # 15*



Girryn said:


> wanted 4000k-5000k



All high-CRI emitters are significantly warmer than that. Your specs fall into the neutral category, and we ain't getting those.


----------



## calipsoii (Aug 28, 2011)

*Re: HDS Systems EDC # 15*



JWRitchie76 said:


> All high-CRI emitters are significantly warmer than that. Your specs fall into the neutral category, and we ain't getting those.


 
Not all, but most. The aforementioned Nichia 183As fall firmly in the cool end of neutral. That said, I can't say I'm shocked that it's an XP-G either; I assumed that was a given the moment Henry announced the cool whites would be using XP-Gs. Every emitter other than the Nichia 119 has such a different footprint and dome design that he'd need a different reflector to accommodate it.


----------



## GaAslamp (Aug 28, 2011)

*Re: HDS Systems EDC # 15*



the.Mtn.Man said:


> Were people really expecting the high CRI emitter to _not_ have a warm tint?



Since the emitter is a Cree, then no, because they're not much into high CRI, and it's easier to achieve that spec with warm white LEDs. I was hoping for a different emitter from a different manufacturer. In the big picture, I didn't *expect* the flashlight to have a cooler tint, since that is not the (niche) market trend I've been seeing with high-CRI flashlights, but I had *hoped* that HDS would go against the trend and give us a new alternative to the other warm high-CRI flashlights that are going to be released in the near future.

Previously I had shied away from the SSC P4-based high-CRI HDS flashlights because I wasn't satisfied with certain aspects of the emitter. I did, however, take notice of its neutral tint, as opposed to warm, which is what got my hopes up for an improved neutral white emitter or even a cool white high-CRI emitter. But it didn't happen, boo-hoo, so I still can't justify (speaking only for myself, which should be implicit for what everybody says) paying that much for a flashlight that in my eyes can't match even my ZebraLight H51c in terms of color rendering. I really like the HDS EDC design and interface, and I wish that the new version would have met my requirements regarding beam tint. No big deal, though, as we all can't have everything we want in life--it's just a statement that I had hoped for something different, and I'm not the only one. It's just for general knowledge, but maybe somebody will listen and bring something new to the market.



Girryn said:


> wanted 4000k-5000k



That would have been fine, but I was hoping for 5000K-6000K, since such high-CRI LEDs actually exist, even today.



TyJo said:


> I was thinking the same thing. This disappointment and claiming the XPG HighCRI is a regression of some sort makes no sense.



It's a regression from the point of view of those who wish for flashlights (as well as other light sources) to emulate daylight as closely as possible. Those who prefer alternatives to the light that we primarily evolved under are perfectly within their right to do so, of course (if somebody prefers deep purple light, that's OK too), but the preference for daylight tint and color rendering does make some sense (to me, everything else makes colors look wrong).



TyJo said:


> The HighCRI XPG is one of the most efficient High CRI emitters that is available currently. HighCRI lights tend to be warm, this is normal, not the exception.



Yes, it's normal for LEDs, and it's going to become more common in the near future, but there is nothing wrong with being exceptional, either (as long as there is a market for it, anyway).



TyJo said:


> The Nichia emitter, which is more neutral supposedly, hasn't been proven reliable and we know Henry doesn't use an emitter unless it has a proven track record. I tried to find information about the HighCRI Nichia emitter but didn't have much luck (specifically with CRI rating and efficiency). I doubt high CRI alternatives to the XPG can match the XPG availability, efficiency, etc.



Admittedly, it's hard to find data for some of the LEDs I have in mind, but here is one that I might be interested in:

http://www.nichia.co.jp/specification/en/product/led/NS6W183A-H1-E.pdf

It is multi-die, which is not ideal for all purposes, but other flashlights have used such LEDs successfully. Its efficiency at 350 mA is not bad (slightly above that of the high-CRI warm XP-G), although its maximum drive current is much lower (should be sufficient for the HDS EDC).

Here is another that looks interesting (specifically the LXW8-PW50):

http://www.philipslumileds.com/pdfs/DS61.pdf

This one has a single die, and while 85 CRI may not seem very high to some, the peculiarities of white phosphor-based LEDs (which I can explain if anybody cares) mean that CRI understates the actual color rendering accuracy of higher-CCT LEDs. In other words, an LED can have more accurate color rendering (relative to sunlight) and a more balanced spectrum than another even when its CRI is lower.
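To see why a fixed spectral flaw costs more CRI at a higher CCT, note that a reference blackbody at a higher temperature puts a larger share of its visible power into the blue-cyan region. A quick numerical sketch (my own toy illustration using Planck's law; this is not the CIE CRI procedure, and the band edges are round numbers I picked for illustration):

```python
import numpy as np

# Physical constants for Planck's law
h, c, k = 6.626e-34, 2.998e8, 1.381e-23

def planck(wl_nm, temp_k):
    """Blackbody spectral radiance at wavelength wl_nm (nm), temperature temp_k (K)."""
    wl = wl_nm * 1e-9
    return (2 * h * c**2 / wl**5) / (np.exp(h * c / (wl * k * temp_k)) - 1)

wl = np.arange(400, 701)            # visible band, 1 nm steps
cyan = (wl >= 480) & (wl <= 520)    # band where phosphor LEDs are deficient

def cyan_fraction(temp_k):
    """Share of visible-band power that falls in the cyan band."""
    spd = planck(wl, temp_k)
    return spd[cyan].sum() / spd.sum()

for temp_k in (3000, 6500):
    print(f"{temp_k} K reference: {cyan_fraction(temp_k):.1%} of visible power in 480-520 nm")
```

The same cyan notch removes a larger fraction of a 6500 K reference than of a 3000 K one, so a warm LED is penalized less for the identical flaw. This is only a cartoon of the effect; real CRI works from chromaticity shifts of test color samples, not raw spectral power.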



JWRitchie76 said:


> All high CRI's are significantly warmer than that.



Not all, as my example above (Nichia NS6W183A-H1, 92 CRI @ 5000K CCT) shows, and CRI is rather flawed as an indicator of color rendering accuracy in any case.


----------



## Harry999 (Aug 28, 2011)

*Re: HDS Systems EDC # 15*



GaAslamp said:


> This one has a single die, and while 85 CRI may not seem very high to some, the peculiarities of white phosphor-based LEDs (which I can explain if anybody cares) mean that CRI understates the actual color rendering accuracy of higher-CCT LEDs. In other words, an LED can have more accurate color rendering (relative to sunlight) and a more balanced spectrum than another even when its CRI is lower.



I am interested. It would help explain why I find all my Zebralight 'c' models to be so satisfying to use. Thanks!


----------



## Bass (Aug 28, 2011)

*Re: HDS Systems EDC # 15*

No surprise that the CRI emitter is an XP-G, it was never likely to be anything else. 

I'm not personally a big fan of the CRI XP-G; it's too warm and makes everything look a muddy brown, but this is just my personal opinion. Many people will like it, and it has one big plus: you can drive it much harder than all the other high-CRI emitters (1.5 amps), so the output is higher (a moot point in the HDS lights, as they aren't driven that hard). In lights that do drive it hard (ArmyTek), it has a real advantage over the other high-CRI LEDs.

I'm with GaAslamp, it would be nice to see more lights using the Nichias and Luxeon Rebel High CRI emitters. I'm tempted to get a Zebralight just to try the Rebel emitter. These Rebels have great potential in 'mainstream' lights.

As for the Nichia 183A-H1, it looks even better. With a 5000K temperature and the potential to be driven at 800mA, it should be an amazing LED. The only way you will ever see these in a light is from Don (McGizmo), I would guess. I can't wait to see if he has an offering using these; they would be perfect in a Sundrop head.


----------



## GaAslamp (Aug 29, 2011)

*Re: HDS Systems EDC # 15*



Harry999 said:


> I am interested. It would help explain why I find all my Zebralight 'c' models to be so satisfying to use. Thanks!



They ARE satisfying to use, aren't they? :twothumbs I think that's because, where it counts the most, their specific Rebel emitter's color balance (despite its 4000K CCT) is more like that of sunlight at around 5800K than a closer match to an ideal 4000K source would be. The latter would have given this emitter a CRI higher than 85, because CRI is measured against an ideal source of the same CCT, but then it would be less like sunlight than it already is. Does that make sense? I know I keep repeating this mantra, but CRI doesn't mean all that much.

As for the "peculiarities" of LEDs in regard to measuring CRI that I offered to explain, it's simple yet not that easy to explain, although I'll try. Basically, the spectra of phosphor-based white LEDs all have a large, narrow spike in the blue range, all have a significant deficiency in the cyan range (between blue and green), and most have a deficiency in the red range (unless they're warm white, in which case they probably have way too much red and orange relative to blue).

Because CRI is measured against an ideal incandescent source of the same CCT, as mentioned above, the cyan deficiency that all LEDs have (some more than others, but it's always major)--being close to the blue end of the visible spectrum--has a more significant negative impact at higher CCTs than at lower CCTs. This means that just by adding more of the same phosphor blend to convert more of the underlying blue light (these LEDs are all actually blue with a phosphor blend on top, hence the narrow blue spike) to other wavelengths, the CRI automatically goes up as the CCT goes down, all else being equal. The resulting LED's color rendering may be superior or inferior, but either way it will have a higher CRI because the cyan deficiency has by definition become less significant in the calculation.

This is why most high-CRI LEDs tend to be warm white--it's relatively easy to achieve and doesn't necessarily mean a whole lot (such an LED could have superior color rendering, but it wouldn't be because of its CRI alone). This is a major flaw in using CRI to judge the color rendering accuracy of phosphor-based LEDs, especially for those who consider sunlight an ideal light source.
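That characteristic shape--a narrow blue pump spike, a valley in cyan, then a broad phosphor hump--can be caricatured numerically. A toy sketch (two Gaussians standing in for pump and phosphor; every parameter here is invented purely for illustration, not taken from any real emitter's datasheet):

```python
import numpy as np

wl = np.arange(400, 701, 1.0)  # visible band, 1 nm steps

def gauss(mu, sigma):
    """Unit-height Gaussian over the wavelength grid."""
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

def phosphor_led(w):
    """Cartoon phosphor-white LED: narrow blue pump plus broad yellow phosphor.
    w = fraction of pump light converted by the phosphor (more w -> 'warmer')."""
    return (1 - w) * gauss(450, 12) + w * gauss(575, 45)

for w in (0.5, 0.8):
    spd = phosphor_led(w)
    blue      = spd[(wl >= 440) & (wl <= 460)].mean()   # pump spike
    cyan_band = spd[(wl >= 480) & (wl <= 500)].mean()   # the valley
    amber     = spd[(wl >= 565) & (wl <= 585)].mean()   # phosphor hump
    print(f"w={w}: blue {blue:.2f}, cyan {cyan_band:.2f}, amber {amber:.2f}")
```

Whatever the phosphor fraction, the 480-500 nm band stays the dimmest of the three in this cartoon, which is exactly the deficiency that the CRI calculation weighs differently at different CCTs.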

There are other issues that come into play that seriously call into question the usefulness of CRI even when comparing LEDs of the same CCT. I mentioned color balance above, and went into greater depth in the following posts, complete with spectral graphs and sample photos referenced from other posts of mine:

http://www.candlepowerforums.com/vb...-CRI-Release&p=3730719&viewfull=1#post3730719
http://www.candlepowerforums.com/vb...-CRI-Release&p=3730790&viewfull=1#post3730790



Bass said:


> No surprise that the CRI emitter is an XP-G, it was never likely to be anything else.



That's what I've been hearing for a long time, but I just kept wondering "Why does it have to be a Cree?" Not that I have anything against Cree, but they're mostly about maximum output rather than accurate color rendering, which gave me some doubt until now.



Bass said:


> I'm not personally a big fan of the CRI XP-G, it's too warm and makes everything look a muddy brown but this is just my personal opinion.



I haven't seen its spectrum, but if it's similar to that of other warm white high-CRI LEDs, then it probably just has a bit more blue to render cool colors a little better. At such a low CCT, everything is bound to look orangish or brownish in general, though, as there is still way more red and orange than blue.



Bass said:


> I'm with GaAslamp, it would be nice to see more lights using the Nichias and Luxeon Rebel High CRI emitters. I'm tempted to get a Zebralight just to try the Rebel emitter.



That's exactly what I did, and I was surprised by how accurately it renders colors (perhaps in part because I used to put blind faith in CRI, and 85 didn't seem all that high). It's not the sort of thing that would immediately hit most people in the face, so to speak, but when virtually every color you view with it looks practically as natural as you'd expect from viewing it under sunlight, the cumulative effect is kind of a revelation as LEDs go (or even incandescent flashlights--this one is more accurate to my eyes). While it has a yellowish cast, colors still look extremely well balanced otherwise. I can't guarantee that you'd like it as much, as we all see a bit differently, but I'm very impressed.



Bass said:


> As for the Nichia 183A-H1, it looks even better. With a 5000K temperature and the potential to be driven at 800mA it should be an amazing LED.



It certainly could be, I think. While I'm a bit concerned about its color balance from looking at its spectrum, its very high CRI notwithstanding, I'd take a chance on an HDS EDC equipped with this LED. With the SunDrop, I don't know--it's a bit out of my price range and also rather specialized. In addition, I'm pretty satisfied with my H51c right now, even if I'm always shooting for something that is potentially better.


----------



## the.Mtn.Man (Aug 29, 2011)

*Re: HDS Systems EDC # 15*



GaAslamp said:


> Previously I had shied away from the SSC P4-based high-CRI HDS flashlights because I wasn't satisfied with certain aspects of the emitter. I did, however, take notice of its neutral tint, as opposed to warm...



P4 is what was used in previous HDS lights, correct? Because if my high CRI Clicky is any indication it's much closer to warm than neutral.



GaAslamp said:


> I still can't justify (speaking only for myself, which should be implicit for what everybody says) paying that much for a flashlight that in my eyes can't match even my ZebraLight H51c in terms of color rendering.



Well, if it's color rendering you value then you should know that the H51c at 83 CRI lags behind the HDS offerings at 90+ CRI. Reading some of the early reports of the H51c, many users were disappointed that it didn't make colors "pop" like they were expecting from an emitter marketed as "high CRI" with some reporting that it was little better than a standard neutral white (read bluish) emitter.



GaAslamp said:


> I was surprised by how accurately it renders colors...



I appreciate the tone of your posts, but I think you're trying to make a subjective judgement sound like an objective one. A CRI of 90 will render colors more accurately than 80 at any color temperature. This is undeniable. What it sounds like is that you prefer cooler tints over warm and judge those colors to be more "accurate" even at a lower CRI and therefore technically less accurately rendered than a warm emitter with a higher CRI. That's fine, and your preference isn't wrong, but I think it's somewhat misleading to downplay the importance of a high CRI for accurate color rendering.

To put it another way, your "ideal source" is sunlight; my "ideal source" is incandescent (or late afternoon sun, what photographers call "the golden hour"). Thus we are each going to judge colors to be more "accurate" at our preferred color temperature even if the CRI is actually low relative to a high CRI source.

As far as I'm concerned, LED technology will have reached its zenith once we have emitters capable of producing 3200K at a CRI of 100.


----------



## pjandyho (Aug 29, 2011)

*Re: HDS Systems EDC # 15*



the.Mtn.Man said:


> I appreciate the tone of your posts, but I think you're trying to make a subjective judgement sound like an objective one. A CRI of 90 will render colors more accurately than 80 at any color temperature. This is undeniable. What it sounds like is that you prefer cooler tints over warm and judge those colors to be more "accurate" even at a lower CRI and therefore technically less accurately rendered than a warm emitter with a higher CRI. That's fine, and your preference isn't wrong, but I think it's somewhat misleading to downplay the importance of a high CRI for accurate color rendering.
> 
> To put it another way, your "ideal source" is sunlight; my "ideal source" is incandescent (or late afternoon sun, what photographers call "the golden hour"). Thus we are each going to judge colors to be more "accurate" at our preferred color temperature even if the CRI is actually low relative to a high CRI source.
> 
> As far as I'm concerned, LED technology will have reached its zenith once we have emitters capable of producing 3200K at a CRI of 100.


 
+100! I prefer warmer tints to cool ones. What GaAslamp said makes sense, but as you said, it is a matter of personal preference. Honestly, after reading all the lengthy replies from GaAslamp, I can't help feeling that we are being "force-fed" the idea that a high CRI of 85 with a cooler CCT is the way to go. I am sorry, GaAslamp, but this is what I felt after reading your posts. We all have our preferences, so let's just leave it at that.


----------



## kaichu dento (Aug 29, 2011)

*Re: HDS Systems EDC # 15*



GaAslamp said:


> It's a regression from the point of view of those who wish for flashlights (as well as other light sources) to emulate daylight as closely as possible. Those who prefer alternatives to the light that we primarily evolved under are perfectly within their right to do so, of course (if somebody prefers deep purple light, that's OK too), but the preference for daylight tint and color rendering does make some sense (to me, everything else makes colors look wrong).


Once again the straw man of daylight comes up, when in nature there are many presentations of light - all the way from a clear dawn to a cloudy day, sunny noon to blazing orange sunset, not to mention the light of the moon or, on a moonless night, the aurora and other favorites.

Favorite light sources, now that we can be so picky, are really like preferences in cuisine, and we should accept that the guy who likes pizza is no more wrong than the one who prefers sushi.

There is no superior light source, only a wide range of preferences, and constantly trying to rate LEDs against mid-day sunlight is only helpful for those who prefer it over early/late sun, as many of us do.


the.Mtn.Man said:


> To put it another way, your "ideal source" is sunlight; my "ideal source" is incandescent (or late afternoon sun, what photographers call "the golden hour"). Thus we are each going to judge colors to be more "accurate" at our preferred color temperature even if the CRI is actually low relative to a high CRI source.
> 
> As far as I'm concerned, LED technology will have reached its zenith once we have emitters capable producing 3200K at a CRI of 100.


Great post, and it was nice to read after having waited all night to write a response making some of the same points. I guess we'll be in the same state of bliss when we get that 3200K, CRI-100 emitter. :thumbsup:


----------



## bondr006 (Aug 29, 2011)

*Re: HDS Systems EDC # 15*

Although I don't like bluish-purple cool tints, I don't like the other end of the spectrum either. I do not believe that overly yellow-brown light renders colors any better than bluish-purple does. As a retired Naval photographer, I would not use either of those tints to render true color, but something in the middle. White balance is an amazing thing for true color rendering. I realize that tint is a subjective preference, but extremes either way do not render colors accurately.


----------



## GaAslamp (Aug 29, 2011)

*Re: HDS Systems EDC # 15*



the.Mtn.Man said:


> P4 is what was used in previous HDS lights, correct?



Correct, and more specifically it's either the S42180 (4000K CCT) or N42180 (3000K CCT). I've always been told that its CCT is 4000K, although some photos taken under its illumination appear orangish to the point where I can't be sure. The S42180 has a large peak in the orange range of the visible spectrum, just like the N42180 has, which means that orange is going to be emphasized either way, making it hard to tell its CCT due to the many unknown variables involved in examining photos taken by others.



the.Mtn.Man said:


> Because if my high CRI Clicky is any indication it's much closer to warm than neutral.



It may be "warm," it may be on the warm side of "neutral" due to manufacturing variation, it may be "neutral" (~4000K) with a peak in the orange range (as its spectral power distribution graph indicates), or a combination of these possibilities. It is for this reason that I chose not to purchase this version of the flashlight, as impressed as I was--at least at the time--with its 93 CRI. In fact, this was what first made me doubt the usefulness of CRI in evaluating color rendering accuracy, and everything I've learned about CRI, and from my experiences using flashlights with varied spectra since then, has confirmed my doubts.



the.Mtn.Man said:


> Well, if it's color rendering you value then you should know that the H51c at 83 CRI lags behind the HDS offerings at 90+ CRI.



I'm not sure where you got 83, but it is rated by the manufacturer as having a minimum of 80 and a typical value of 85. A few months ago that would have meant something to me, but not anymore.



the.Mtn.Man said:


> Reading some of the early reports of the H51c, many users were disappointed that it didn't make colors "pop" like they were expecting from an emitter marketed as "high CRI" with some reporting that it was little better than a standard neutral white (read bluish) emitter.



The purpose of color rendering accuracy is not to make colors "pop" (more than they should) but to render them accurately. To make them really "pop" requires exaggerating certain wavelengths, usually in the red range of the visible spectrum. I find that my H51c makes many colors "pop" a little more than the typical neutral white LED due to its higher red output, but at the same time it's not overly red to the point where it adds more "pop" or "richness" than objects should have, like most (not all) other high-CRI LEDs do (sort of like putting on BluBlockers).

Although I fully acknowledge that we all see a little differently (or maybe even a lot in some cases), I wouldn't be surprised if somebody expressed disappointment in the H51c/SC51c (same LED) even if their eyes and visual perception were exactly like mine. That's because they may be looking for something different, such as more "pop" or a specific tint rather than what I personally define as accuracy. I'm quite pleased that it gives me the accuracy--relative to sunlight--that I'm seeking, or something pretty darn close to it. I've always found artificial light sources lacking in this respect, whether they "pop" or not.



the.Mtn.Man said:


> I appreciate the tone of your posts, but I think you're trying to make a subjective judgement sound like an objective one.



Although what I've said does include both subjective and objective elements, I assure you that there was no deliberate attempt to confuse the two. Critical readers should be able to sort them out, I think.



the.Mtn.Man said:


> A CRI of 90 will render colors more accurately than 80 at any color temperature. This is undeniable.



It is neither undeniable nor self-evident. I think that my arguments and examples are sufficient to introduce doubt with regard to placing blind faith (as I once did, effectively) in CRI as an indicator of color rendering accuracy. I tried to do it in a way that laymen (myself included) could understand, but there are people with much better credentials who are thinking the same thing, as evidenced in the following links:

http://en.wikipedia.org/wiki/Color_rendering_index#Criticism_and_resolution
http://www.knt.vein.hu/staff/schandaj/SJCV-Publ-2005/521.pdf



the.Mtn.Man said:


> What it sounds like is that you prefer cooler tints over warm and judge those colors to be more "accurate" even at a lower CRI and therefore technically less accurately rendered than a warm emitter with a higher CRI. That's fine, and your preference isn't wrong, but I think it's somewhat misleading to downplay the importance of a high CRI for accurate color rendering.



My personal preferences are no secret, and I don't hide the fact that they form the perspective from which I discuss these issues. This is perfectly natural, since this particular discussion started when somebody accused those who have such a perspective of making no sense (which I suppose implies that those who have other perspectives are making sense :ironic:). In response, I tried my best to make sense of my (and others') perspective without invalidating other perspectives (we all have different preferences and goals).

As for CRI, there are many good reasons to rightfully downplay its importance--particularly where phosphor-based LEDs are concerned. I've given a few of my own, as well as references to the harsh criticisms of professionals for everybody here to peruse. If anything that I or they said is wrong, then by all means refute it--I'd rather be corrected than mistaken while still believing that I'm right (and misleading others).

Generally speaking, CRI is crude and only useful in certain cases. Because of this, and because by definition it only measures accuracy against an ideal reference *of the same CCT*, it is virtually useless as an indicator of the color rendering accuracy of, say, 4000K LEDs against an ideal 5800K source such as sunlight. Does this make sense? I'm saying that if a specific 4000K LED renders colors very much like my ideal of 5800K sunlight does (with some flaws, of course), then its CRI--due to the very definition of CRI--will in all likelihood be lower than that of other 4000K LEDs that come closer to the theoretical ideal 4000K source. The problem is that I don't care about the ideal 4000K source while CRI--by definition--does, therefore the higher CRIs of these other LEDs mean nothing to me. This would be true even if CRI were a PERFECT method of determining accuracy, but in reality it is far, FAR from that.
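The reference-choice point can be made concrete with a toy score. In the sketch below, my own crude stand-in for a fidelity index is 100 minus a scaled RMS difference between normalized spectra (the real CIE procedure instead uses test color samples and chromatic adaptation). Even a hypothetically perfect 4000K source--which by definition has a CRI of 100--still scores below 100 when judged against a 5800K daylight-like reference, and CRI never measures that gap:

```python
import numpy as np

# Physical constants for Planck's law
h, c, k = 6.626e-34, 2.998e8, 1.381e-23

def planck(wl_nm, temp_k):
    """Blackbody spectral radiance -- a stand-in for an ideal source of that CCT."""
    wl = wl_nm * 1e-9
    return (2 * h * c**2 / wl**5) / (np.exp(h * c / (wl * k * temp_k)) - 1)

wl = np.arange(400, 701)  # visible band, 1 nm steps

def norm(spd):
    """Normalize a spectrum so total visible power is 1."""
    return spd / spd.sum()

def toy_score(test, ref):
    """100 minus a scaled RMS spectral difference -- NOT the CIE CRI formula."""
    err = np.sqrt(np.mean((norm(test) - norm(ref)) ** 2))
    return 100 * (1 - err / norm(ref).max())

ideal_4000k = planck(wl, 4000)
print(toy_score(ideal_4000k, planck(wl, 4000)))  # vs its own CCT: perfect score
print(toy_score(ideal_4000k, planck(wl, 5800)))  # vs a daylight CCT: a gap remains
```

The score against the source's own CCT is perfect by construction, just as CRI is, while the score against the daylight reference reveals a deviation that the same-CCT comparison is blind to.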


----------



## the.Mtn.Man (Aug 29, 2011)

*Re: HDS Systems EDC # 15*



bondr006 said:


> Although I don't like blueish purple cool tints, I don't like the other end of the spectrum either. I do not believe that overly yellow brown renders colors any more accurately than blueish purple does.



It depends on what you mean by "accurate". A bluish/purplish tint will not render colors "accurately" if your reference is a 3200K incandescent.

I think where this discussion is getting lost is that people are using "accurate" to mean "preferred".


----------



## the.Mtn.Man (Aug 29, 2011)

*Re: HDS Systems EDC # 15*



GaAslamp said:


> I assure you that there was no deliberate attempt to confuse the two.



Then I guess it was inadvertent.



GaAslamp said:


> I think that my arguments and examples are sufficient to introduce doubt with regard to placing blind faith (as I once did, effectively) in CRI as an indicator of color rendering accuracy.



But CRI _is_ an indicator of color rendering accuracy. That's what "CRI" means: color rendering index as measured on a 100 point scale. A CRI of 100 is as accurate as it gets at any color temperature.

That said, I think you do have a valid point that someone who prefers cool white isn't going to necessarily like how colors look with a warm high CRI emitter even if the latter offers more accurate color rendering (relative to its reference).


----------



## GaAslamp (Aug 29, 2011)

*Re: HDS Systems EDC # 15*



pjandyho said:


> Honestly, after reading all the lengthy replies from GaAslamp, I can't help feeling that we are being "force-fed" into accepting that a high CRI of 85 with a cooler CCT is the way to go. I am sorry GaAslamp but this is what I felt after reading your posts. We all have our preferences so let's just leave it as that.



I'm sorry that you got this impression because it was not my intention. All I'm saying in this regard is that I've found a specific LED--not just any LED of a certain CRI and CCT, but one specific model--that appears to my eyes to render the vast majority of colors in a manner very similar to that of 5800K sunlight (despite having too much yellow that serves to bring its CCT down to 4000K). This only applies to those who are looking for the same thing that I am.

However, in a broader sense I used this specific example (and others) to show why CRI is not the most meaningful and useful indicator of color rendering accuracy (especially for LEDs). Two LEDs that have the exact same CRI and CCT can still have extremely different spectra and therefore render colors very differently. If one could be judged as subjectively more accurate than the other, which I think is pretty likely, then CRI didn't really tell us much. With different CCTs, the result is totally meaningless for the reasons I gave earlier.



kaichu dento said:


> Once again the straw man of daylight comes up when in nature there are many presentations of light - all the way from a clear dawn to a cloudy day, sunny noon to blazing orange sunset, not to mention light of the moon, or even on a moonless night, the aurora and other favorites.



Straw man?  Anyway, I think you're reading things into what I said. I never said that my ideal light source was the ONLY correct one. You're right that there are many forms of natural light, although I must point out that sunlight (as an average that excludes the extremes) is what I've personally spent more time under (even when inside) during the course of my life. What others have experienced and may prefer for whatever reasons (if any) is just as valid, and talking about my preferences in no way implies otherwise.



kaichu dento said:


> There is no superior light source, only a wider range of preferences, and to constantly try to rate LED's in comparison to mid-day sunlight is only helpful for those who prefer it over early/late sun, as many of us do.



That is true, but my point was that CRI does not do this in most cases. Some have tried to tell others that a higher CRI always means superior color rendering accuracy, but that is NOT what it means--it means something very specific and narrow in scope, and on top of that it's pretty crude and inherently ill-suited for use with LEDs (all backed up earlier with solid reasoning that nobody has bothered to refute).

Additionally, there is also the point about looking at the whole picture of color balance (relative spectral power distribution) rather than just CRI and CCT, which captures only a tiny fraction of the information about a light source. I have to say that it has always been difficult to broaden the perspectives of others, but it's even more difficult, as well as ironic, when everybody pretty much ignores all of that and focuses on a tiny aspect of the discussion where they think (mistakenly in this case) that my perspective is overly narrow. :laughing:


----------



## Sparky's Magic (Aug 29, 2011)

*Re: HDS Systems EDC # 15*

@GaAslamp,

Thank you!

Before Z/L came out with their High CRI, I emailed Lillian with a query as to the specs of the emitter they intended to use, and her reply was that they had several to choose from and that they would probably go with the Philips Luxeon Rebel, "...because it looks the best." Enough said!


----------



## TwitchALot (Aug 29, 2011)

*Re: HDS Systems EDC # 15*



the.Mtn.Man said:


> But CRI _is_ an indicator of color rendering accuracy. That's what "CRI" means: color rendering index as measured on a 100 point scale. A CRI of 100 is as accurate as it gets *at any color temperature.*


 
This is the key. CRI is a very technical definition, and as previously stated, an 80 CRI rated emitter CAN render color better than a 90 CRI emitter. 



GaAslamp said:


> However, in a broader sense I used this specific example (and others) to show why CRI is not the most meaningful and useful indicator of color rendering accuracy (especially for LEDs). Two LEDs that have the exact same CRI and CCT can still have extremely different spectra and therefore render colors very differently. If one could be judged as subjectively more accurate than the other, which I think is pretty likely, then CRI didn't really tell us much. With different CCTs, the result is totally meaningless for the reasons I gave earlier...
> 
> That is true, but my point was that CRI does not do this in most cases. Some have tried to tell others that a higher CRI always means superior color rendering accuracy, but that is NOT what it means--it means something very specific and narrow in scope, and on top of that it's pretty crude and inherently ill-suited for use with LEDs (all backed up earlier with solid reasoning that nobody has bothered to refute).
> 
> Additionally, there is also the point about looking at the whole picture of color balance (relative spectral power distribution) rather than just CRI and CCT, which captures only a tiny fraction of the information about a light source. I have to say that it has always been difficult to broaden the perspectives of others, but it's even more difficult, as well as ironic, when everybody pretty much ignores all of that and focuses on a tiny aspect of the discussion where they think (mistakenly in this case) that my perspective is overly narrow.



But... this is where I'll disagree. The definition of CRI is not perfect. But there is no perfect solution. I think CRI is a very useful tool provided that people understand what it actually entails. Although it's true that two LEDs with the same CRI and CCT can have different spectra, because CRI is an average over many colors with a cap of 100, the spectrum cannot be so skewed that one color is rendered extremely poorly while the source still scores a very high CRI.

That is, 70 CRI can be relatively meaningless, because a low average like that allows for significant deviation in the accurate rendering of the colors involved - red could be rendered perfectly accurately and two other colors rendered **** poor, and you could end up with that average. But the nature of 90+ CRI must be that all of the colors are represented somewhat well (though not perfectly), because if one color were represented very very poorly, the average couldn't be that high. Of course, the more colors involved, the more deviation there can be. And it is still true that the spectral output can vary widely even for 90+ CRI emitters. But provided that spectral distribution is available, anyone can freely choose what emitter will suit them best. One 95 CRI emitter may be slightly lacking on the yellow but be more accurate in the blue. Another 95 CRI emitter may be the opposite. So long as consumers can tell from the datasheet and pick which one suits them best, I don't see a problem here.
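The arithmetic behind this averaging argument can be sketched in a few lines. This is a toy illustration, not the real CIE calculation: the eight-sample structure mirrors the R1-R8 special indices, but all the values are invented.

```python
# Toy illustration of CRI-as-average: Ra is the mean of eight special
# indices R1..R8, each capped at 100 (negative values are possible).
# The numbers below are invented, not measured.

def general_cri(special_indices):
    """Mean of eight special color rendering indices."""
    assert len(special_indices) == 8
    return sum(special_indices) / len(special_indices)

# A 70-Ra source can hide one terribly rendered sample among good ones:
uneven = [100, 100, 95, 90, 85, 80, 75, -65]
print(general_cri(uneven))  # -> 70.0

# For Ra >= 90, though, the worst sample is bounded below, since the
# other seven can contribute at most 100 each: 8*90 - 7*100 = 20.
print(8 * 90 - 7 * 100)  # -> 20
```

The bound is loose, but it shows why a very high average constrains how badly any single sample can be rendered, while a 70 average constrains it hardly at all.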

CRI is not a perfect measure by any means. But what is? Especially when some prefer certain CCTs over others (read: spectral outputs)? What if I don't care whether lighter stuff is rendered well, so long as the darker browns and greens are rendered perfectly? Does it matter that the CRI may be low if the spectral output is good where I want it to be? Given the many preferences of people, I'm not convinced there is a perfect measure of color rendering. The perfect color rendering will depend on the CCT desired by the user and the exact colors desired to be accurately rendered. Furthermore, the accurate rendering of some colors may be more important than the accurate rendering of other colors. So what measure do you propose to give such information in a simple way for general users, if not the current one? Do you think showing the spectral output of the sun at various color temperatures alongside the spectral output of the LED would really be a good thing for most consumers? Would they be able to distinguish what matters, what doesn't, and more importantly, _how much it matters_? I don't believe so.

Ultimately, the lack of perfection doesn't mean that CRI is a useless measure. It gives a user who understands what it is a broad and general measure for how accurately it represents colors compared to the correct light source (which varies by CCT). Is it specific enough for users who really want a couple colors to be perfect (maybe mechanics or electrical engineers)? Absolutely not. But for the general user who simply wants a ballpark guesstimate for how good the color rendering will be compared to an ideal light source at the same CCT, it's very good, in my opinion, even though there can be variation in the spectral output because it is an average.

Given the complexity of mixtures of colors, a spectral output comparison may be useful to some extent (even though it'd be nearly impossible to interpret for complex colors at the individual level - so it too has its drawbacks), but that doesn't mean CRI has no place for the consumer.


----------



## GaAslamp (Aug 29, 2011)

*Re: HDS Systems EDC # 15*



the.Mtn.Man said:


> But CRI _is_ an indicator of color rendering accuracy.



Indeed it is, although it is a poor, outdated one that doesn't necessarily take the most important factors into account--or enough of them, for that matter. Many lighting professionals have been trying to replace it, but it remains popular because it's been around for a long time, it's pretty simple (one of its shortcomings), and quite frankly manufacturers (mostly of fluorescent lighting, not necessarily LEDs) have learned how to "game the system" by finding ways to increase CRI without going through the more difficult process of significantly improving color rendering (sort of like studying intensely for a very specific standardized test without truly learning a whole lot in the process).



the.Mtn.Man said:


> That's what "CRI" means: color rendering index as measured on a 100 point scale. A CRI of 100 is as accurate as it gets at any color temperature.



Unless I'm misinterpreting something, you seem to have a lot of faith in the CRI system, despite the major flaws that I explained in some detail, as well as the heavy criticism from professionals in this field who have been trying to greatly improve it or replace it entirely with something different and better. Do you really think it's that good? I don't think that having a higher CRI necessarily means that an LED renders colors more accurately than another, even given the same CCT.



the.Mtn.Man said:


> That said, I think you do have a valid point that someone who prefers cool white isn't going to necessarily like how colors look with a warm high CRI emitter even if the latter offers more accurate color rendering (relative to its reference).



Right, CRI only has any meaning, such as it is, within the same CCT. If those who prefer the color rendering of ideal high-CCT sources can't get their hands on high-CRI flashlights with high CCTs, then they'll have to settle for something with a lower CCT that renders colors the way they prefer (for the most part). Such an LED may, as a direct result, have a lower CRI than expected because of the way that CRI is defined. For these people, the lower CRI is therefore meaningless--all that really counts is the spectrum and how it renders colors for them. What I believe on top of that, however, is that this principle applies more broadly to all cases, especially given the inherent weakness of CRI as a tool for measuring color rendering accuracy (for LEDs in particular).


----------



## TwitchALot (Aug 29, 2011)

*Re: HDS Systems EDC # 15*



GaAslamp said:


> Indeed it is, although it is a poor, outdated one that doesn't necessarily take the most important factors into account--or enough of them, for that matter. Many lighting professionals have been trying to replace it, but it remains popular because it's been around for a long time, it's pretty simple (one of its shortcomings), and quite frankly manufacturers (not necessarily of LEDs but mostly fluorescent lighting) have learned how to "game the system" by finding ways to increase CRI without going through the more difficult process of significantly improving color rendering (sort of like studying intensely for a very specific standardized test but not truly learning a whole lot in the process ).
> 
> Unless I'm misinterpreting something, you seem to have a lot of faith in the CRI system, despite the major flaws that I explained in some detail, as well as the heavy criticism from professionals in this field who have been trying to greatly improve it or replace it entirely with something different and better. Do you really think it's that good? I don't think that having a higher CRI necessarily means that an LED renders colors more accurately than another, even given the same CCT.


 
A "visual industry professional" would be better off with a computer program that can take a spectral distribution and output the ability of said distribution to accurately render _every color_ compared to an ideal light source for any CCT. That is, you plug in the distribution, the CCT you want to compare it to (and maybe the light source to be used as a reference, so you'd need a database for that), and it spits out: Magenta: 90% accuracy, Cyan: 85% accuracy, Pink: 40% accuracy, etc for every conceivably useful color. But is this really practical or useful from a consumer standpoint? Does this mean the current CRI is a *useless* measure? I don't believe so - on either count.
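A rough sketch of the kind of tool described above could bin a spectral power distribution into named color bands and compare each band's share of total power against a reference source. Everything here (band boundaries, the accuracy formula, the sample data, the function names) is invented for illustration; it is not any standard method.

```python
# Hypothetical per-band accuracy report: compare a test SPD's normalized
# power in each color band against a reference SPD.  Band edges and the
# "accuracy" formula are made up for illustration only.

BANDS = {"blue": (450, 490), "cyan": (490, 520), "green": (520, 565),
         "yellow": (565, 590), "red": (620, 700)}

def band_power(spd, lo, hi):
    """Total power of SPD samples {wavelength_nm: power} inside [lo, hi)."""
    return sum(p for wl, p in spd.items() if lo <= wl < hi)

def per_band_accuracy(test_spd, ref_spd):
    """Fraction (capped at 1.0) of the reference's normalized band power
    that the test source reproduces in each band."""
    t_total, r_total = sum(test_spd.values()), sum(ref_spd.values())
    report = {}
    for name, (lo, hi) in BANDS.items():
        t = band_power(test_spd, lo, hi) / t_total
        r = band_power(ref_spd, lo, hi) / r_total
        report[name] = min(t / r, 1.0) if r else 1.0
    return report

# A flat reference vs. a test source with a weak cyan region:
ref = {wl: 1.0 for wl in range(450, 700, 10)}
test = {**ref, 490: 0.2, 500: 0.2, 510: 0.2}
print(per_band_accuracy(test, ref))
```

A real tool would need the actual reference illuminant for the chosen CCT and perceptual weighting, which is exactly the complexity being debated here.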


----------



## GaAslamp (Aug 29, 2011)

*Re: HDS Systems EDC # 15*



Norm said:


> Agreed, please feel free to open a new thread on the topic.
> 
> Any further OT posts will be deleted.



Oops, I read this late (finished my post after dinner). Sorry.  Transfer to a new thread or delete as you please.


***************************************************************




TwitchALot said:


> This is the key. CRI is a very technical definition, and as previously stated, an 80 CRI rated emitter CAN render color better than a 90 CRI emitter.



Right, and for completeness, while an 80 CRI emitter USUALLY does not render color better than a 90 CRI emitter, in some cases it CAN.



TwitchALot said:


> But... this is where I'll disagree. The definition of CRI is not perfect. But there is no perfect solution. I think CRI is a very useful tool provided that people understand what it actually entails.



We're still in agreement here, although I should point out that the last thing you said seems to be a major issue.



TwitchALot said:


> Although it's true that two LED's with the same CRI and CCT can have different spectra, because CRI is an average of many colors, and there is a cap (100), the spectra cannot be so skewed that one color is rendered extremely poorly and still have a very high CRI.



That's all true (and any theoretical exceptions would probably look like nothing we've ever seen in real life).



TwitchALot said:


> That is, 70 CRI can be relatively meaningless, because a low average like that allows for significant deviation in the accurate rendering of the colors involved - red could be rendered perfectly accurately and two other colors rendered **** poor, and you could end up with that average. But the nature of 90+ CRI must be that all of the colors are represented somewhat well (though not perfectly), because if one color were represented very very poorly, the average couldn't be that high.



That's what I always thought...until I saw this 90 CRI (3000K CCT) LED:

*(attached spectral power distribution graph of the 90 CRI, 3000K LED not preserved)*

Obviously the cyan range is rendered very poorly, even for an LED (of any CRI), and it looks to be a bit light in the red range for a 3000K LED on top of that (makes sense because of the major lack of cyan). You seem to be quite familiar with CRI. What do you make of this? For me, it just plain makes me doubt the value of CRI and want to look at spectral graphs instead.



TwitchALot said:


> Of course, the more colors involved, the more deviation there can be. And it is still true that the spectral output can vary widely even for 90+ CRI emitters.



And I can provide examples at request.



TwitchALot said:


> But provided that spectral distribution is available, anyone can freely choose what emitter will suit them best. One 95 CRI emitter may be slightly lacking on the yellow but be more accurate in the blue. Another 95 CRI emitter may be the opposite. So long as consumers can tell from the datasheet and pick which one suits them best, I don't see a problem here.



My point exactly, although convincing people that it's a good idea is not always easy.



TwitchALot said:


> CRI is not a perfect measure by any means. But what is?



Nothing can be perfect, but we can do better.



TwitchALot said:


> Given the many preferences of people, I'm not convinced there is a perfect measure of color rendering. The perfect color rendering will depend on the CCT desired by the user and the exact colors desired to be accurately rendered. Furthermore, the accurate rendering of some colors may be more important than the accurate rendering of other colors. So what measure do you propose to give such information in a simple way for general users, if not the current one?



There is no simple way, in my opinion, but for those who are picky, which is a lot of us, I've been trying to show that CRI and CCT aren't enough, as well as provide sufficient background so that we can more meaningfully describe what we're seeing in spectral terms, which I've referred to as color balance. So when I say "You know, this particular 4000K 85 CRI LED renders most colors remarkably like 5800K sunlight does, except for the extra yellow content," people won't just blithely respond with something like "Yeah, but this other 4000K LED is 93 CRI, so it's more accurate" or maybe just "85 CRI is not very high, so you must be wrong." :shakehead

Unfortunately, I'm not getting through to everybody. People seem to have a lot of faith in CRI even though they don't seem to understand it well, so I've been spending most of my words on attacking CRI as though it ate my pet hamster. :laughing: And as a result of this and other petty little arguments I really didn't want to get into (honestly), some people find me profoundly irritating. :nana: I'm just trying to help promote the sharing of knowledge and understanding between people...really.... :duck:



TwitchALot said:


> Ultimately, the lack of perfection doesn't mean that CRI is a useless measure. It gives a user who understands what it is a broad and general measure for how accurately it represents colors compared to the correct light source (which varies by CCT). Is it specific enough for users who really want a couple colors to be perfect (maybe mechanics or electrical engineers)? Absolutely not. But for the general user who simply wants a ballpark guesstimate for how good the color rendering will be compared to an ideal light source at the same CCT, it's very good, in my opinion, even though there can be variation in the spectral output because it is an average.



I still haven't seen anything you've said that I disagree with (except specifically for how good 90 CRI LEDs are supposed to be). Now try to actually *convince* people. oo: Good luck! :wave:



TwitchALot said:


> Given the complexity of mixtures of colors, a spectral output comparison may be useful to some extent (even though it'd be nearly impossible to interpret for complex colors at the individual level - so it too has its drawbacks), but that doesn't mean CRI has no place for the consumer.



True enough, but using spectral graphs is more interesting and may help us understand what we and others are seeing, allowing us to make more informed decisions about purchases (according to our own personal preferences), for example.


----------



## Norm (Aug 29, 2011)

*Re: HDS Systems EDC # 15*

GaAslamp last thread reinstated after DM51's move. 
Norm


----------



## TwitchALot (Aug 29, 2011)

GaAslamp said:


> That's what I always thought...until I saw this 90 CRI (3000K CCT) LED:
> 
> 
> 
> ...



I'm not an expert here per se, but let me take a shot. 

What do I make of it? That's A-Okay! Why?


Recall that light is not interpreted at equal values, due to the varying sensitivity of our rods and cones (i.e. the Purkinje shift). The upshot is that in darkness, blue is perceived to be brighter than, say, red (at the same radiant flux) because of the way our eyes and brains work. The obvious implication is that if you're going to be lacking a spectral region but still want high CRI, _blue is the region you want to lack, because we are more sensitive to it._ The result is that even though the cyan range is lacking in spectral output, the fact that we need less of it to perceive the same level of brightness offsets most of the loss in color rendering, thanks to our increased sensitivity to light at those wavelengths. Thus, I propose that the issue of color rendering is not as simple as wanting a broad and even spectral output. Because our eyes are more sensitive to some colors than others under varying conditions, an uneven spectrum designed to mitigate oversensitivity in some regions and undersensitivity in others is important. If the cyan region were significantly increased in amplitude, our increased sensitivity to blue in low light would, I suspect, wash out other colors, "fading" them in intensity, in a sense.
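The underlying premise, that the eye weights wavelengths unequally, can be sketched with a luminosity-style weighting function. The code below uses a crude Gaussian stand-in for the photopic V(lambda) curve; the real CIE curve is tabulated data, and the scotopic curve that the Purkinje shift actually concerns peaks elsewhere, near 507 nm.

```python
import math

# Crude Gaussian stand-in for the CIE photopic luminosity function
# V(lambda), which peaks near 555 nm.  The 80 nm width is an invented
# illustration value, not CIE data.
def v_photopic(wavelength_nm):
    return math.exp(-((wavelength_nm - 555.0) / 80.0) ** 2)

# Equal radiant power at different wavelengths is not perceived as
# equally bright, so a dip in a low-sensitivity band costs less
# perceived brightness than the same dip near the 555 nm peak.
for wl in (470, 505, 555, 610):
    print(f"{wl} nm: relative sensitivity {v_photopic(wl):.2f}")
```

Even this crude curve shows why a spectral deficit is not equally costly everywhere in the visible range.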



> Nothing can be perfect, but we can do better.
> 
> There is no simple way, in my opinion, but for those who are picky, which is a lot of us , I've been trying to show that CRI and CCT aren't enough, as well as provide sufficient background so that we can more meaningfully describe what we're seeing in spectral terms, which I've referred to as color balance. So when I say "You know, this particular 4000K 85 CRI LED renders most colors remarkably like 5800K sunlight does, except for the extra yellow content" people won't just blithely respond with something like "Yeah, but this other 4000K LED is 93 CRI so it's more accurate" or maybe just "85 CRI is not very high, so you must be wrong." :shakehead
> 
> I still haven't seen anything you've said that I disagree with (except specifically for how good 90 CRI LEDs are supposed to be). Now try to actually *convince* people. oo: Good luck! :wave:



How so? I proposed a more rigorous standard that is less practical for the vast majority of people and difficult to design, but far more accurate. My argument is this: You seem to discount CRI as a useful measurement because it is flawed or isn't accurate enough or isn't always informative in the way we think it should be.

I think it is still a useful tool despite its imperfections, and other methods have issues of their own that keep them from being useful to a broad audience. CRI and CCT values, in my view, are useful measurements, indicative of the ability of a light source to imitate a comparable ideal source - provided you understand their limitations. For most consumers, I think CRI is a useful "ballpark figure" for color rendering (at a given CCT). 



> True enough, but using spectral graphs is more interesting and may help us understand what we and others are seeing, allowing us to make more informed decisions about purchases (according to our own personal preferences), for example.



But it is not as trivial as that, as I just pointed out. Furthermore, once you start getting into mixtures of colors, analysis becomes far, far more complicated. Take a color like purple, for example - red and blue. Under low-light conditions, we are more sensitive to blue than red. So to really get a good, even purple, we need not the 1:1 mixture you would expect for, say, paint, but less blue than red, because equal amounts of red and blue are perceived as excess blue (due to our increased sensitivity to it). A 1:1 mixture would give you a "bluer" shade of purple - not true purple per se. 

Now look at the spectral output of an LED. How easy is it to visually integrate that spectrum to work out how purple the LED will make a truly purple object look? Is it heavy on the blue side or the red? If the blue is lacking, is it lacking just enough to balance our increased sensitivity to blue, or is it lacking so much that the red will overpower the blue and you get a pinker shade of purple?

The point is, this color rendering stuff - spectral output or no - is non-trivial. It is virtually impossible to pin down given the changes in perception with flux, a la Kruithof. To say that CRI and CCT aren't useful measures because of their flaws is, to me, a little silly, because the alternatives can be far less comprehensible - and therefore less useful - if you really want to get into the nooks and crannies of it.


----------



## kaichu dento (Aug 30, 2011)

*Re: HDS Systems EDC # 15*



GaAslamp said:


> Straw man?  Anyway, I think you're reading things into what I said. I never said that my ideal light source was the ONLY correct one. You're right that there are many forms of natural light, although I must point out that sunlight (as an average that excludes the extremes) is what I've personally spent more time under (even when inside) during the course of my life.


It's just that the daylight reference tends to get trotted out with little acknowledgement of what it actually entails. Whether one is under direct sun, the time of day, the sun's angle in the sky - all of these mean color temperatures that are constantly varying, and to suggest that noonday light is the one truly accurate source by which to judge all others is, in my opinion, a straw man argument, generally used when one wishes to dismiss all other light sources as invalid.

I'm glad to see from some of your subsequent statements that this was not your intention, but you evidently love to debate, and sometimes the debate ends up being an end in itself. My only objection is when I read a post suggesting that one person's preference is more correct than everyone else's, but then I never could understand the whole Ford vs. Chevy thing either.


----------



## jtr1962 (Aug 30, 2011)

*Re: HDS Systems EDC # 15*

Let me chime in here. CRI is basically a measure of how well a light source compares to a blackbody _at the same CCT_. As such, it is only valid for comparing light sources at the same CCT. A 70 CRI 3000K light source is inferior to a 100 CRI 3000K light source, both by the CRI metric and in the real world. Sure, 70 CRI might be _good enough_ in many situations, but it still won't render colors the same as a 100 CRI source. The problem with CRI comes when you compare light sources at different CCTs. To me personally, an 85 CRI, 5000K triphosphor fluorescent source blows away a 100 CRI, 2800K incandescent source. That's the failing of CRI: it doesn't take the CCT itself into account. A 1500K candle has a CRI of 100 because it's a blackbody radiator. So does a 1000K toaster heating element, for that matter. I doubt anyone would use either for color-sensitive work. At the extremes, CRI fails badly, because the spectrum of a blackbody will be severely lacking in either long or short wavelengths, depending on whether you have a very high or very low CCT source, with an overabundance of wavelengths on the other side of the spectrum. The end result is that many close colors will be indistinguishable from each other, as well as "wrong" in appearance. For example, I simply can't tell navy blue and black socks apart under incandescent light, no matter how bright. And yet the same task is easy even with a keychain LED light.

LEDs complicate matters even more. It's been known for a while that LED sources with lower CRI numbers than some fluorescent sources actually render colors better to the vast majority of people. The reason is that LED spectra are continuous, albeit with humps and valleys, whereas fluorescent spectra are spiky. It's easy to "game" the CRI metric by carefully choosing the center wavelengths of phosphors, even though it doesn't necessarily result in an improvement in how colors are rendered. To me, a 75 CRI 5000K LED beats the pants off an 85 CRI 5000K triphosphor fluorescent. It gets even worse when looking at RGB LED sources. Here you might have a CRI in the 20s, and yet the vast majority of people prefer the RGB LED source over both phosphor white LED and incandescent. This is why we need a better metric, one which also takes CCT into account. Fortunately, such a metric is under development: the color quality scale. This paper explains the rationale behind the CQS. Basically, the CQS penalizes a light source that has too low or too high a CCT. Traditional incandescent only suffers a penalty of a few points; something like candlelight or dimmed incandescent would suffer a much more severe one. 2000K HPS, which already has a fairly poor CRI of 62 or so, would end up with a CQS score of around 50. Dimmed incandescent and/or candlelight might have a CQS score of about 60 or 65.

Note that neither CQS nor CRI is a measure of how aesthetically appealing a light source is. Some people love the ambiance of candlelight (not me; I personally can't stand it), but it's just not a great source for distinguishing colors. Maybe that's why some people love it: it tends to hide uneven skin tones and other blemishes. In my opinion, the CQS didn't go far enough in penalizing light sources with too high or too low a CCT. Nevertheless, it's at least a step in the right direction compared to CRI. There are some metrics which compare a spectrum to an equal-energy spectrum and penalize any deviation from it. Again, this isn't ideal, because an equal-energy spectrum is an artificial construct which doesn't exist in nature. Maybe a metric which measures how much the spectrum of a light source deviates from ~5500K sunlight (which incidentally isn't a perfect blackbody spectrum) would be ideal. While one can argue that the CCT of sunlight isn't constant, the fact is that for most of the day it is 5000K or above. This is what our visual systems are optimized for, even though they function at much higher or lower CCTs. If there's going to be any standard against which other light sources are measured, then noon equatorial sunlight should probably be that standard. Complicating things further, though, is the fluorescence of some objects under UV from sunlight. These objects will look different under a light source which matches sunlight only in the visual spectrum, but not in the UV spectrum. In practice, though, this probably doesn't matter, because most things humans artificially light don't fluoresce much.
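The point that a 100 CRI blackbody at low CCT is still severely short of blue light can be checked directly with Planck's law. A minimal sketch, comparing the blue-to-red power balance of two blackbodies:

```python
import math

# Planck's law for blackbody spectral radiance, used to compare how much
# blue (450 nm) vs. red (650 nm) a 100 CRI blackbody emits at two CCTs.
H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann

def planck(wavelength_m, temp_k):
    """Spectral radiance of a blackbody (W sr^-1 m^-3)."""
    return (2 * H * C**2 / wavelength_m**5) / (
        math.exp(H * C / (wavelength_m * KB * temp_k)) - 1)

for t in (1500, 5500):
    ratio = planck(450e-9, t) / planck(650e-9, t)
    print(f"{t} K blackbody: blue/red power ratio {ratio:.3f}")
```

The 1500 K ratio comes out roughly two orders of magnitude below the 5500 K one, which is why a candle, despite its perfect CRI, makes navy and black hard to tell apart.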


----------



## the.Mtn.Man (Aug 30, 2011)

*Re: HDS Systems EDC # 15*



TwitchALot said:


> CRI is a very technical definition, and as previously stated, an 80 CRI rated emitter CAN render color better than a 90 CRI emitter.



It depends on what you mean by "better". If you mean "objectively more accurate" then, no, an 80 CRI is not "better" than 90. If you mean "what I prefer" then you may be correct depending on the color temperature of each source. So, for example, if you prefer a cool source then that will look "better" to you than a warm source even if the latter has a higher CRI and therefore renders colors objectively more accurately.

I think people need to do a better job of distinguishing between "accuracy" and "preference".


----------



## the.Mtn.Man (Aug 30, 2011)

*Re: HDS Systems EDC # 15*



GaAslamp said:


> Unless I'm misinterpreting something, you seem to have a lot of faith in the CRI system, despite the major flaws that I explained in some detail, as well as the heavy criticism from professionals in this field who have been trying to greatly improve it or replace it entirely with something different and better. Do you really think it's that good? I don't think that having a higher CRI necessarily means that an LED renders colors more accurately than another, even given the same CCT.



I may not be up on all the technical aspects, but I do know that when I pull out my high CRI Clicky, I'm no longer jealous of how much better my kids' dollar store incans look.


----------



## ledaladdin (Aug 30, 2011)

*Color Temperature (K) & Color Rendering (CRI)*


*What is Color Rendering Index (CRI)? *

Color rendering describes how a light source makes the color of an object appear to human eyes and how well subtle variations in color shades are revealed. The Color Rendering Index (CRI) is a scale from 0 to 100 indicating how accurately a "given" light source renders color when compared to a "reference" light source. 

The higher the CRI, the better the color rendering ability. Light sources with a CRI of 85 to 90 are considered good at color rendering. Light sources with a CRI of 90 or higher are excellent at color rendering and should be used for tasks requiring the most accurate color discrimination.
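The arithmetic behind the index is simple. Per the CIE 13.3 method, each of eight standard color samples gets a "special" index from its color shift (ΔE, measured in the CIE 1964 U*V*W* space) between the test source and the reference illuminant, and the general CRI (Ra) is their average. A minimal Python sketch of just this final step (computing the ΔE values themselves requires a full colorimetric pipeline, which is not shown here):

```python
# Special color rendering index for one test sample (CIE 13.3):
# 100 minus 4.6 times the sample's color shift dE versus the reference.
def special_cri(delta_e):
    return 100 - 4.6 * delta_e

# The familiar general CRI (Ra) is the average of the first
# eight special indices.
def general_cri(delta_es):
    assert len(delta_es) == 8
    return sum(special_cri(d) for d in delta_es) / 8
```

A source with no measurable color shift on any sample scores 100; a uniform shift of 5 units on every sample scores 77.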

It is important to note that CRI is independent of color temperature (see discussion of color temperature). Examples: A 2700K ("warm") color temperature incandescent light source has a CRI of 100. One 5000K ("daylight") color temperature fluorescent light source has a CRI of 75 and another with the same color temperature has a CRI of 90. 

To further understand the physics of color rendering, we need to look at spectral power distribution.

The visible part of the electromagnetic spectrum is composed of radiation with wavelengths from approximately 400 to 750 nanometers. The blue part of the visible spectrum is the shorter wavelength and the red part is the longer wavelength with all color gradations in between.

Spectral power distribution graphs show the relative power of wavelengths across the visible spectrum for a given light source. These graphs also reveal the ability of a light source to render all, or only selected, colors.

Below is a typical spectral power distribution graph for daylight.

*[image: spectral power distribution of daylight]*
Notice the strong presence (high relative power) of ALL wavelengths (or the "full color spectrum"). Daylight provides the highest level of color rendering across the spectrum. 
Compare the daylight spectral power distribution with that for a particular fluorescent lamp. 

*[image: spectral power distribution of a 3000K fluorescent lamp]*
The most obvious difference is the generally lower level of relative power compared to daylight, except for a few spikes. All wavelengths (the "full spectrum") are again present, but only certain wavelengths (the spikes) are strongly present. These spikes indicate which parts of the color spectrum will be emphasized in the rendering of color for objects illuminated by the light source. This lamp has a 3000K color temperature and a CRI of 82. It produces light that is perceived as "warmer" than daylight (3000K vs. 5000K). Its ability to render color across the spectrum is not bad, but certainly much worse than daylight's. Notice the deep troughs where the curve almost reaches zero relative power at certain wavelengths.

Here is another fluorescent lamp. 

*[image: spectral power distribution of a 5000K fluorescent lamp]*
This spectral power distribution looks generally similar to the one above except it shows more power at the blue end of the spectrum and less at the red end. Also, there are no low points in the curve that come close to zero power. This lamp has a 5000K color temperature and a CRI of 98. It produces light that is perceived as bluish white (similar to daylight) and it does an excellent job of rendering colors across the spectrum.
If you want a high color rendering bulb that produces light perceived as warm white, choose a bulb with a color temperature of 3000K or 3500K. If you want light perceived as white, choose 4000K. For a bulb that simulates daylight, choose a color temperature of 5000K or higher.
Note: all incandescent and halogen light bulbs, by definition, have a CRI close to 100. They are excellent at rendering color. However, except for some halogen bulbs, most incandescents produce a warm 2800K color temperature. The only way to achieve the bluish white appearance of daylight with incandescent bulbs is to use bulbs coated with neodymium. However, these bulbs have a CRI much lower than 90. They are not good for accurate color rendering across the spectrum.


----------



## DM51 (Aug 30, 2011)

ledaladdin, quite a lot of your post has been excised, as it contained numerous advertising links which are not permitted here. Other posts you have made containing similar links have not been approved. 

Please read: Advertising Policies and Procedures for CPF and CPFMP​


----------



## TwitchALot (Aug 30, 2011)

*Re: HDS Systems EDC # 15*



the.Mtn.Man said:


> It depends on what you mean by "better". If you mean "objectively more accurate" then, no, an 80 CRI is not "better" than 90. If you mean "what I prefer" then you may be correct depending on the color temperature of each source. So, for example, if you prefer a cool source then that will look "better" to you than a warm source even if the latter has a higher CRI and therefore renders colors objectively more accurately.
> 
> I think people need to do a better job of distinguishing between "accuracy" and "preference".


 
It most certainly can be. And by better I mean accurate, since we are talking about accurate color rendering. You still don't understand how CRI works or how it is measured. Because it is based on a certain light source as a reference, an LED can perfectly mimic that light source - 100 CRI. But guess what? If your reference light source distorts certain colors due to its CCT, then the color rendering ability of that reference is imperfect. Therefore, *your LED perfectly mimics a light source that doesn't accurately render colors.* This is 100 CRI. It is not perfect color rendering.

CRI is only a measure of how well your artificial light source imitates a reference standard (depending on CCT). What if your reference light source distorts certain colors? Do we want to _perfectly_ mimic an imperfect color rendering light source?



Ledaladdin said:


> The Color Rendering Index (CRI) is a scale from 0 to 100 indicating how accurate a "given" light source is *at rendering color when compared to a "reference" light source.*
> 
> The higher the CRI, the better the color rendering ability.



See this? That is the key. If your reference light source does not render colors accurately, then CRI is moot if you are interested in color rendering accuracy. The second quoted sentence does NOT necessarily follow from the emphasized part of the first.


----------



## jtr1962 (Aug 30, 2011)

*Re: HDS Systems EDC # 15*



TwitchALot said:


> It most certainly can be. You still don't understand how CRI works or how it is measured. Because it is based on a certain light source as a reference, an LED can perfectly mimic that light source - 100 CRI. But guess what? If your reference light source distorts certain colors due to its CCT, then the color rendering ability of that reference is imperfect. Therefore, *your LED perfectly mimics a light source that doesn't accurately render colors.*
> 
> CRI is only a measure of how well your artificial light source imitates a reference standard (depending on CCT). What if your reference light source distorts certain colors? Do we want to _perfectly_ mimic an imperfect color rendering light source?


Yes, exactly. In the case of CRI, the reference source is a blackbody for CCTs under 5000K and CIE Standard Illuminant D (i.e., daylight) for CCTs of 5000K and over. Obviously, blackbodies at low color temperatures bias colors towards the red end of the spectrum. This makes distinguishing fine gradations of cooler colors hard or impossible.

It gets worse. A higher CRI, even at the same CCT, doesn't necessarily mean the light source is better. This paper has a good example on page 5: one light source has a CRI of 71, the other a CRI of 82, yet the CRI 71 source is superior both in pictures and visually. The reason is that a light source can render one color very poorly (in this case red) and still receive a relatively high CRI score if most of the other colors are rendered fairly well. On the CQS (Color Quality Scale), the ratings more closely reflect what we see in the real world: the poorer light source, the one with the higher CRI of 82, has a CQS rating of only 74, while the better one, which rated only 71 on the CRI scale, has a CQS rating of 83. Bottom line: there are better metrics than CRI. Gamut area index, in my opinion, is even better than CQS because it measures the total range of colors a given light source can render, and is _independent of the CCT_. Here the results match real-world observations even better. There's more reading out there on how gamut area index compares to other metrics.
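The gamut-area idea reduces to straightforward geometry: plot the chromaticity coordinates of the standard test samples under the light source and measure the polygon they enclose. A minimal sketch of that geometric step (the real gamut area index uses the eight CIE test samples in a CIE uniform chromaticity space and normalizes so the equal-energy illuminant scores 100; the coordinates below are made up purely for illustration):

```python
# Shoelace formula: area of the polygon traced by (x, y) chromaticity
# coordinates of the test samples, visited in order around the polygon.
def gamut_area(points):
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Made-up sample coordinates: a source whose samples span a larger
# polygon can render a wider range of distinguishable colors.
narrow = [(0.20, 0.45), (0.25, 0.52), (0.30, 0.50), (0.26, 0.44)]
wide = [(0.15, 0.40), (0.25, 0.58), (0.38, 0.50), (0.30, 0.38)]
print(gamut_area(wide) > gamut_area(narrow))  # True
```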


----------



## palimpsest (Aug 30, 2011)

*Re: HDS Systems EDC # 15*

A color under a 3000K light can be perfectly accurate even with a warm hue. A color being warmer or cooler doesn't make it less accurate than the "same" color under neutral light; it simply doesn't occupy the same place in the color space. It's not the temperature of a light that makes colors accurate but its spectrum. The more complete a spectrum is (though that's not the only condition), the more accurately all colors will be rendered. Colors under high CRI LEDs tend to be more accurate because a high CRI LED's spectrum is more complete than a cool white LED's.

Sorry for my bad English; I can't explain it better, but do a search on "*metamerism*".


----------



## kaichu dento (Aug 30, 2011)

Brasso said:


> I love the high CRI xpg. Very soothing tint. Easy on the eyes.
> 
> To each their own.


Not to suggest that anyone's level of understanding isn't impressive or educational, but the talk of all these abbreviated terms, graphs and scientific explanations has little bearing on a good number of us laymen who, in our own simple manner, understand that we like the advances being made lately in addressing the CRI of emitters. Brasso's post becomes the de facto OP of this thread, and the subject is not CRI but preference; as Brasso, Mtn.Man, myself and others have been saying from the start, we prefer the presentation of high CRI emitters as we have seen them so far.

This need some of you feel to edify the ignorant masses of CPF'ers as to what proper light, CRI, CCT etc. is about is great - to a point. This is a classroom after a fashion, but only to the extent that we come away with what we can make use of, or are personally interested in, and it's incredible how many posts contain the same information, graphs and such explaining to us what CRI is.

JTR1962 and Palimpsest came up with a couple of the few interesting posts on the whole matter (at least for me), in that they spoke in a relatively plain manner, as to friends. But now we have some of the most recent members speaking as if to a class they've just been asked to take over as replacement teachers, and even though we were in this class years before they arrived, they act as if we have no idea what we're talking about and it's time to rectify the problem.

No hesitation on the part of most CPF'ers to learn, learn and learn more, but please, if you're going to explain to us the facts of life when it comes to lights, at least assume we know the birds and bees part of it - and moreover, we know what positions we like. [Note: Allegorical references not to be taken literally.]


----------



## TwitchALot (Aug 30, 2011)

kaichu dento said:


> Not to suggest that anyone's level of understanding is not impressive, educational or anything else, but the talk of all these abbreviated terms, graphs and scientifical explanations have little bearing on a good number of us laymen who, in our own simple manner understand that we like the advances being made in the area of addressing the CRI of emitters lately. Brasso's post becomes the defacto OP of this thread and the subject is not CRI, but preference, and as Brasso, Mtn.Man, myself and others have been saying from the start is that we prefer the presentation of high CRI emitters as we have seen them so far.
> 
> This need some of you feel to edify the ignorant masses of CPF'ers as to what proper light, CRI, CCT etc is about is great - to a point. This is a classroom after a manner, but only to the extent that we come away with what we can make use of, or are personally interested in, and it's incredible how many posts contain the same information, graphs and such explaining to us what CRI is.
> 
> ...


 
I'm not interested in discussing preference at this juncture simply because I know what mine is (~2900K and high CRI), and others know what their preferences are better than I would. That said, discussing the value of CRI as a measurement is not really about preference, in the sense that discussing CRI is technical - not preferential beyond "I like high CRI lights." But what does that mean? It seems there is a very real discontinuity between what CRI actually is and what some members think CRI is. It's one thing to say that one prefers high CRI lights, but to state that CRI is indicative of an LED's absolute ability to render color is simply not true. That is not preference but fact, and I think it's important for us to make that distinction because it's clearly led to quite a bit of confusion. I can't speak for everyone, but I don't think anyone's preference here is right or wrong, as a preference is just that. But let's not mix up the preferential "I like high CRI lights because they render colors better for me" with the technical (and "incorrect") "high CRI lights render color better."


----------



## kaichu dento (Aug 30, 2011)

TwitchALot said:


> I'm not interested in discussing preference at this juncture simply because I know what mine is (~2900 K and high CRI), and others know what their preferences are better than I would. That said, discussing the value of CRI as a measurement is not really about preference in the sense that discussing CRI is technical - not preferential beyond, "I like high CRI lights." But what does that mean? It seems that there is a very real discontinuity between what CRI actually is and what some members think CRI is. It's one thing to say that one prefers High CRI lights or another, but to state that CRI is indicative of an LED's absolute ability to render color is simply not true. That is not preference, but fact, and I think it's important for us to make that distinction because it's clearly lead to quite a bit of confusion. I can't speak for everyone, but I don't think anyone's preference here is right or wrong, as a preference is just that. But let's not mix up the preferential, "I like high CRI lights because they render colors better for me," with the technical (and "incorrect"), "high CRI lights render color better."


Your posting manner wasn't really on my mind when I wrote that this morning, but I do appreciate getting an education, especially when it helps me have an idea of what the conversational points mean when reading threads about a light I may or may not be interested in getting. I think you understand what I'm saying, though, when some of the posts others have made can seem a bit much at times.

I'm still not sure where my favorite lies, but I'm sure I'd like your 2900K, and anywhere up to 3200K or so, in the highest available CRI of course.


----------



## easilyled (Aug 30, 2011)

I am very pleased to see some enlightened posts questioning the value of CRI as an indicator of color rendering. 
CRI is a man-made definition and needs to be updated as technology changes - specifically for LED technology, which has done much to highlight its shortcomings.

Most, if not all man-made definitions need to be updated with the changing times, otherwise we'd still think that the earth was flat and that the world was created in 7 days. (No doubt some may still think the latter)


----------



## bondr006 (Aug 30, 2011)

easilyled. That last statement is not appreciated and certainly amounts to nothing but baiting. The subject of personal theological or religious beliefs or lack thereof has no place in this forum.....and for good reason.


----------



## easilyled (Aug 30, 2011)

bondr006 said:


> easilyled. That last statement is not appreciated and certainly amounts to nothing but baiting. The subject of personal theological or religious beliefs or lack thereof has no place in this forum.....and for good reason.


 
Not intended as baiting. Science has disproved this and this is my only point.


----------



## bondr006 (Aug 30, 2011)

easilyled said:


> Not intended as baiting. Science has disproved this and this is my only point.



This is only your opinion and one that does not belong here.


----------



## easilyled (Aug 30, 2011)

bondr006 said:


> This is only your opinion and one that does not belong here.


 
I don't believe that blind faith in something should obscure facts.
This is precisely my point because, CRI is sometimes used in blind faith as something that is unalterable, just like other examples.


----------



## DM51 (Aug 30, 2011)

Back on topic please... :ironic:


----------



## bondr006 (Aug 30, 2011)

removed...


----------



## TwitchALot (Aug 30, 2011)

kaichu dento said:


> Your posting manner wasn't really on my mind when I wrote that this morning, but I do appreciate getting an education, especially when it helps in having an idea what the conversational points mean when reading threads regarding a light I may or may not be interested in getting. I think you understand what I say though when it seems that some of the posts others have made can seem a bit much at times.
> 
> I'm still not sure where my favorite lies, but I'm sure I'd like your 2900k, and all the way up to 3200 or so, in the highest available CRI of course.


 
Being a technical person, I'm not bothered by technical discussions, but I am obviously aware that some or most people aren't interested in the details as long as they have the punchline. That said, I think it's important for CRI to be clarified and for users to understand what it actually is, because it's somewhat of a misnomer and has obviously created confusion.


----------



## kaichu dento (Aug 31, 2011)

easilyled said:


> I am very pleased to see some enlightened posts questioning the value of CRI as an indicator of color rendering.
> CRI is a man-made definition and needs to be updated as technology changes, specifically led technology which has done much to highlight its shortcomings.
> 
> Most, if not all man-made definitions need to be updated with the changing times, otherwise we'd still think that the earth was flat and that the world was created in 7 days. (No doubt some may still think the latter)


The 7 day thing is allegorical, and the only thing that's for sure is that the light on day one was most definitely high CRI.


easilyled said:


> Not intended as baiting. Science has disproved this and this is my only point.


Can you please delete this so that we don't have to let this stand unanswered.


easilyled said:


> I don't believe that blind faith in something should obscure facts.
> This is precisely my point because, CRI is sometimes used in blind faith as something that is unalterable, just like other examples.


Nice point, but much more forum friendly without the sharp stick you presented it with. Blind faith goes in many directions, but again, that's a feast for another thread.


TwitchALot said:


> I'm not bothered by technical discussions being a technical person, but I am obviously aware that some/most people aren't interested in the details as long as they have the punchline. That said, I think it's important for CRI to be clarified and for users to understand what it actually is, because it's somewhat of a misnomer and has obviously created confusion.


There was a guy who sold a light last year with some yellow paint on the lens and labeled it as high CRI!

I'm definitely not anti-education; it's just when the posting gets to the point that it has little to no bearing on the OP and turns into a technical data thread, with numerous posters coming up with the same information over and over. By all means, keep informing us of what we need to know in order to make properly informed decisions. It's getting clearer to me slowly but surely, but on the other hand, I'm already aware that warm/cool is only connected to CRI, not defined by it.


----------



## the.Mtn.Man (Aug 31, 2011)

*Re: HDS Systems EDC # 15*



TwitchALot said:


> CRI is a only a measure of how well your artificial light source imitates a reference standard (depending on CCT). What if your reference light source distorts certain colors? Do we want to _perfectly_ mimic an imperfect color rendering light source?



Exactly. It's about preference and not necessarily "accurate" color rendering. To that end CRI is a useful reference if you understand what it's really measuring.


----------



## blasterman (Aug 31, 2011)

While I have a solid understanding of CRI, reading through the posts here and noting the general partisanship towards certain flavors of emitters, I'm not sure CRI is all that relevant. Most of the bickering involves complaints about >5000K CCT emitters and the goofy color casts that result.

Question: are there any mainstream torches out there using >5000K CCT, 85 CRI emitters? How about even 80 CRI? Typically you don't hit 80 CRI until around 4000K, and while there are >4000K emitters out there with CRI in the mid to high 80s, they are far from the norm. Virtually all above 5000K are 70-75 CRI, correct?

We've all noticed that CRI increases as CCT gets lower in LEDs. The reason is that amber is the main 'swing spectrum' in terms of LED CRI, and it's also the least efficient color to generate. Warmer LEDs by nature have a higher amber component, and with more amber, green and red can be balanced better, so CRI rises. Contrary to grade-school science, all colors of the rainbow cannot be reproduced with any degree of precision using just RGB. As to why we don't have 85 CRI, 6000K emitters, that's something I haven't put my skull around, but it may be due to the stimulation levels required by secondary phosphors compared to the initial blue.

The exception to the industry trend is the recent 'ANSI' spec emitters, made in both Cree and Rebel varieties, which sacrifice CRI (amber) for greater lumen values. Having tested a few of the low CRI ANSI spec emitters, I can tell you their color rendition is terrible compared to typical CRI emitters. However, the 'ANSI' spec emitters are geared towards interior lighting, and I can't see a reputable torch maker even dreaming of using them.

The short argument is that I think we're simply disagreeing on CCT values and those odd tints that show up from time to time at the edges of the bin envelope chart. The difference from 80-85 or 85-90 CRI is, in my book, otherwise not visible.


----------



## TwitchALot (Aug 31, 2011)

*Re: HDS Systems EDC # 15*



the.Mtn.Man said:


> Exactly. It's about preference and not necessarily "accurate" color rendering. To that end CRI is a useful reference if you understand what it's really measuring.


 
But preference of tint wasn't the issue; color rendering and the value of CRI is. And in that sense, CRI as a measurement has deep flaws. Again, I don't think it's useless, but an 80 CRI emitter can objectively render colors better than a 90 CRI emitter. So to say that 90 CRI renders colors better than 80 CRI isn't necessarily true. If tint is your concern, then CRI doesn't have much bearing on the matter. In terms of CCT/tint, I find CRI to be almost meaningless, at least within a small spread.



blasterman said:


> Question: are there any main stream torches out there using >5000k CCT - 85 CRI emitters? How about even 80CRI? Typically you don't hit 80 CRI until around 4000K, and while there are >4000k emitters out there with CRI in the mid to high 80's they are far from the norm. Virtually all above 5000K are 70-75 CRI, correct?



As far as I know, no.



> The short arguement is I think we're disagreeing on simply CCT values and those odd tints that show up from time to time at the edges of the bin envelope chart. The difference from 80-85 or 85-90 CRI in my book is otherwise not visible.


In my experience with Cree emitters, I honestly can't tell the difference between an 80 CRI emitter and a 90 CRI emitter at a similar CCT (it's possible I haven't found the right color scheme to highlight the difference in spectral output, or that it simply isn't noticeable with such a small difference). But mistaking a preference for a certain CCT for accurate color rendering and high CRI is a technical misunderstanding that I think is important to clear up, at least to some extent.


----------



## jtr1962 (Aug 31, 2011)

blasterman said:


> While I have a solid understanding of CRI, reading through the posts here and noting the general partisianship towards certain flavors of emitters I'm not sure if CRI is all that relevant. Most of the bickering involves complaints about >5000k CCT emitters and the goofy color casts that result.
> 
> Question: are there any main stream torches out there using >5000k CCT - 85 CRI emitters? How about even 80CRI? Typically you don't hit 80 CRI until around 4000K, and while there are >4000k emitters out there with CRI in the mid to high 80's they are far from the norm. Virtually all above 5000K are 70-75 CRI, correct?


Probably high 70s is where typical non-high-CRI cooler emitters fall. Some manufacturers may say 70 to 75 in the spec sheet, but those are minimum values; usually when you see independent tests you'll find CRI around 78 or so. Cree seems to be slightly better than Lumileds, both on paper and visually. There aren't any technical obstacles to making high CCT, high CRI emitters. You mostly need a secondary phosphor to fill in the cyan valley, and another to increase the amount of deep red. And high CCT, high CRI emitters won't suffer as much of an efficiency hit as their low CCT, high CRI cousins. Why they're not more common is beyond me. It's funny how in the fluorescent world most of the high CRI stuff tends to be around 5000K to 6000K, whereas in the LED world you're lucky to get 3500K in high CRI.

On another note, a CRI 78, 5000K LED looks worlds better than a CRI 85, 5000K triphosphor fluorescent. Maybe that's why high CRI, high CCT emitters aren't made: practically speaking, so long as they fall near the Planckian locus, it might be difficult to tell them apart from standard CRI emitters. That's my theory anyway. I'd love to see one in person just to make the comparison.



> The short arguement is I think we're disagreeing on simply CCT values and those odd tints that show up from time to time at the edges of the bin envelope chart. The difference from 80-85 or 85-90 CRI in my book is otherwise not visible.


Indeed, once you hit a CRI in the low 80s, the majority of people will not be able to distinguish it from a light source with a higher CRI. The differences between CRI 85 and CRI 93, for example, are subtle at best. The key is to make sure that no colors are rendered very poorly, even if none are rendered perfectly.


----------



## blasterman (Aug 31, 2011)

Agreed.

I'll also add my experience from a few summers back, watching a pair of cops case a parking lot: one was carrying a very large, high-powered LED torch with a typical low CRI, pseudo-purplish 7000-6500K LED, while the other had a typical incan. I could hear each cop radio in vehicle descriptions and colors, and it was disturbingly funny hearing the same car described as different colors based on the light they were using.


----------



## TyJo (Aug 31, 2011)

It seems ANSI has solved the problems with lumen ratings. Right now we have CRI for... I don't even know at this point. What I do want to know from those unhappy with "CRI" or how it is referenced on CPF: is there a scale that exists today that LED manufacturers can use, other than CRI, that accurately measures how well an LED represents color? I have seen the other metrics/scales/measurement methods above, but are those directly applicable and practical for LEDs? If so, why don't manufacturers use these more accurate measurements?


----------



## Shooter21 (Sep 1, 2011)

Where can I get a production high CRI light that's closest to an incan color temp?


----------



## jtr1962 (Sep 2, 2011)

Shooter21 said:


> Where can I get a production high CRI light that's closest to an incan color temp?


I think the high-CRI XP-G is pretty good. They're available from Cutter. XPGWHT-U1-0000-00AF8 is 90+ CRI and 2900K. XPGWHT-U1-0000-00AE7 is 90+ CRI and 3000K. I have some of the 3000K. To me they look pretty close to a halogen incandescent.


----------



## Shooter21 (Sep 2, 2011)

jtr1962 said:


> I think the high-CRI XP-G is pretty good. They're available from Cutter. XPGWHT-U1-0000-00AF8 is 90+ CRI and 2900K. XPGWHT-U1-0000-00AE7 is 90+ CRI and 3000K. I have some of the 3000K. To me they look pretty close to a halogen incandescent.


Does anyone make torches with that emitter?


----------



## mvyrmnd (Sep 3, 2011)

Shooter21 said:


> Does anyone make torches with that emitter?


 
The Armytek Predator is available with that. There might be a couple of others. Otherwise you need to go custom.


----------



## Max_Power (Sep 3, 2011)

I have some of the Malkoff "warm" (actually neutral) XPG emitters and they are nearly as efficient / bright as the cool white versions. The High-CRI XPG has a low color temperature (~3000K) so the tint looks like an incandescent light. At first I had buyer's remorse, but once I started using it to illuminate my flower garden (and people) I began to appreciate the richness of the browns and reds. It only seems like it is dim because of the orange tint, but it is actually very bright and easy on the eyes. In order to fully appreciate it, you have to use it without any sunlight "contamination" because the beam tint is similar to sunrise/sunset. That's not necessarily a bad thing - I prefer to do photography during "the magic hour" at sunrise and sunset because of the warm, yellow-orange light. It doesn't make a white wall look white though. That's what the more neutral tints are good for (MC-E Warm, M60W, M61W emitters.) 

XPG emitters in a D26 reflector don't have a lot of throw, but they are wonderful for lighting up a large room or the whole path ahead, including stuff on the sides and above. Combine that with the improved browns, greens, and reds from the HCRI and neutral-warm LEDs, and I am one happy flashaholic these days.

I am still tempted to buy another LED light if it isn't ridiculously expensive and can beat the color rendering of the neutral and warm XPG. I'd get another M60W if they were still available. Whatever was used in the RA HCRI is also beautiful but I dislike the relatively low output, high price (yes it is a technical masterpiece), and the complex user interface.

Last night something went bump in my backyard and I grabbed my trusty AR15 to investigate. I had forgotten that the light on it was using a standard Malkoff M60 cool-white. Although the tint was noticeably bluer than my current standard, it was only a little bit bluer than "white" and had great beam shape for weaponlight duty (100+ yard throw with just-right spill.) I guess it is a case of "the right tool for the right job."


----------



## Houdiny (Sep 4, 2011)

Since I thought this thread was useless without pics (insert smiley here...), I took some of my lights and a color chart that came with a water testing set for my tank. It is absolutely necessary to be able to differentiate between those colors, so I put my lights to the test.
The images could be sharper, but I wanted to go with a large aperture so I wouldn't have to set the ISO any higher. The camera's (Canon 7D) white balance was set to daylight, 5700K. The lights were ceiling bounced.

The contestants:





From left to right: Ti Quark Mini 123 S3, Ti Quark 123x2 XPG R5 Cool White, HDS EDC Rotary XPG R5 Cool White, Liteflux LF3XT (which I consider to be very nicely tinted), Quark AAx2 with XPG Q2 High CRI 3000K, and a Surefire M3 incandescent (with a CRI of 100, I guess).

First, a daylight reference shot (not ceiling bounced, of course...):





Next, the Ti Quark Mini 123 S3:





Ti Quark 123x2 XPG R5 Cool White:





HDS EDC Rotary XPG R5 Cool White:





Liteflux LF3XT:





Quark AAx2 with XPG Q2 High CRI:





Surefire M3:


----------



## TyJo (Sep 4, 2011)

Nice post Houdiny. I don't own any titanium lights, but those titanium 4sevens look good.


----------



## the.Mtn.Man (Sep 5, 2011)

TyJo said:


> Is there a scale that exists today that LED manufacturers can use other than CRI, one that can accurately measure how well an LED represents color?


Again, it depends on what you mean by "accurate". CRI ratings seem to do their job well enough for warmer sources (4000K and under) but seem to fall down at higher color temperatures, partly because above 5000K the reference illuminant shifts from a blackbody to a CIE daylight spectrum.


----------



## the.Mtn.Man (Sep 5, 2011)

Houdiny said:


> The Camera's (Canon 7d) white balance was set on daylight, 5700K.


 
I'll note again that the flaw in these kinds of comparisons is that emitters further away from the reference will look comparatively "worse". That is to say, if you set the white balance for 3200K, cooler color temperatures would look overly blue and unappealing.
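To sketch the effect numerically (my own toy model, not anything measured in this thread): approximate each source as a blackbody sampled at three made-up single wavelengths standing in for the camera's R/G/B channels, then apply per-channel gains computed so that a 3200 K source comes out neutral. A 5700 K source then comes out heavily blue:

```python
import math

def planck(wavelength_nm, temp_k):
    """Relative blackbody spectral radiance (Planck's law, arbitrary units)."""
    lam = wavelength_nm * 1e-9
    return 1.0 / (lam**5 * (math.exp(0.0143877 / (lam * temp_k)) - 1.0))

def rgb(temp_k):
    # crude single-sample stand-ins for R/G/B channel responses
    return [planck(w, temp_k) for w in (600, 550, 450)]

reference = rgb(3200)                  # camera balanced for 3200 K
gains = [1.0 / c for c in reference]   # gains that render 3200 K neutral

balanced = [c * g for c, g in zip(rgb(5700), gains)]
blue_excess = balanced[2] / balanced[0]   # blue channel relative to red
print(f"blue/red after 3200 K balance: {blue_excess:.2f}")  # roughly 3x: looks blue
```

The exact numbers are meaningless (real channels integrate broad sensitivity curves), but the direction of the bias is the point: the further a source's color temperature sits from the white balance setting, the more tinted it photographs.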

That said, I'm surprised to see that at least as far as the camera is concerned, there's no difference in color rendering between sunlight and the HDS Rotary with its cool white XPG emitter. Does it really look that good in person? Any idea what the CRI is on that?

With the high CRI Rotaries perpetually one month away (seriously, Henry has been saying they're a month away since June), I'm starting to toy with the idea of getting a cool white Rotary despite my love of warm white high CRI.


----------



## uk_caver (Sep 5, 2011)

*Re: HDS Systems EDC # 15*



TwitchALot said:


> A "visual industry professional" would be better off with a computer program that can take a spectral distribution and output the ability of said distribution to accurately render _every color_ compared to an ideal light source for any CCT. That is, you plug in the distribution, the CCT you want to compare it to (and maybe the light source to be used as a reference, so you'd need a database for that), and it spits out: Magenta: 90% accuracy, Cyan: 85% accuracy, Pink: 40% accuracy, etc for every conceivably useful color. But is this really practical or useful from a consumer standpoint? Does this mean the current CRI is a *useless* measure? I don't believe so - on either count.


When it comes to 'conceivably useful colors', presumably that's somewhat subjective, or at least dependent on biology and environment?

In areas of the spectrum where colours look pretty much the same over a relatively wide range of wavelengths (biology), if the things we tend to look at (environment) don't have a particularly peaky spectrum when lit by uniformly coloured light, then actual colour rendering would presumably be basically the same for a light that had a flat output and one that had a spectrally rough one?

Likewise, if there are areas of the spectrum where the eye is more colour-discriminating, depending on the objects in the environment it could presumably still be possible to have a light that looked good even if it had a trough in the spectrum - for example, an LED light with a relative lack in the cyan area might work pretty well at lighting up cyan objects if real-world cyan objects looked cyan largely because they reflected/emitted decent amounts of bluer and greener light rather than just being peaky in the cyan area of the spectrum?

Presumably, a lot also depends on whether a light source is just one of many, or the only type around - a light source lacking at the blue end of the spectrum may well look much less biased than it actually is if it's the sole source of light, due to the eye adapting to the light and effectively dialling up its sensitivity to blue via chemical changes in the retina (and possibly in downstream processing as well)?

How would one rate a light source that looked effectively perfect with almost all objects, but made the odd object which used very spectrally spiky pigments look wrong?

I dare say it would be possible to make colour charts which looked fine in daylight/black body light but pretty wrong in more 'peaky' light, and also to make colour charts which looked pretty similar in both.

I wonder if it would be practical to have test charts that really reflected real-world colours both in their actual colour and their overall spectra, or maybe test charts that had one or more patches of the 'same' colour achieved in different ways, some 'flatter' than others?


----------



## Houdiny (Sep 5, 2011)

the.Mtn.Man said:


> I'll note again the flaw in these kinds of comparisons is that emitters that are further away from the reference will look comparatively "worse". That is to say that if you set the white balance for 3200K, the result would be that cooler color temperatures would look overly blue and unappealing.
> 
> That said, I'm surprised to see that at least as far as the camera is concerned, there's no difference in color rendering between sunlight and the HDS Rotary with its cool white XPG emitter. Does it really look that good in person? Any idea what the CRI is on that?
> 
> With the high CRI Rotaries perpetually one month away (seriously, Henry has been saying they're a month away since June), I'm starting to toy with the idea of getting a cool white Rotary despite my love of warm white high CRI.


 
You're probably right regarding the color balance of the camera. I was thinking about setting it to AWB, but figured that would mess with the results as well. 
My Rotary's CCT is definitely on the cold side, but it still renders colors quite beautifully. E.g. shining a cold white light onto your hand often produces a very unpleasant skin tone, but with my Rotary the skin's color is just fine.
I too was thinking about waiting for the HCRI Rotary, but then I got the normal one instead, and I'm absolutely pleased with it, especially when seeing how warm the HCRI XPG of my Quark is.


----------



## the.Mtn.Man (Sep 5, 2011)

Houdiny said:


> My Rotary's CCT is definitely on the cold side, but it still renders colors quite beautifully. E.g. shining a cold white light onto your hand often produces a very unpleasant skin tone, but with my Rotary the skin's color is just fine.


That certainly makes it tempting. The reason I started to shy away from cool white is because of how overly blue it made everything look, but perhaps manufacturers have been quietly improving the color rendering of their cool white emitters.


----------



## DM51 (Sep 5, 2011)

It seems to me there are a whole lot of people mincing around the real issue here, which is that this is a LED vs. Incan thread in disguise. It's a boring and sterile discussion, with the pro-Incan people pushing the benefits of high CRI, and the pro-LED people saying CRI is irrelevant or unimportant.

I'm already regretting I split the posts off from the HDS thread and let the discussion continue here. I should have deleted the whole lot. And that is pretty much what will happen, unless I see less BS and partisan point-scoring going on here. It's a *BORING THREAD*, with a lot of long-winded and ill-informed drivel from people who have scant understanding of the topic. 

It will need to improve, or it will be closed.


----------



## Houdiny (Sep 5, 2011)

Oh boy, someone's having a bad day?! I don't find this thread particularly boring... Let's just see if someone has anything else to contribute that hasn't already been said. 
Summing up: tints are very subjective and influence perceived color rendering. Even if one emitter has a high CRI, an emitter with a lower CRI can subjectively be better at rendering colors because of its higher CCT.


----------



## the.Mtn.Man (Sep 5, 2011)

I don't see how there has been anything objectionable posted here. As far as I can tell, participants are being polite, and I personally have found the discussion interesting.

But if a moderator feels he just has to close it, I guess that's his prerogative. :shrug:


----------



## DM51 (Sep 5, 2011)

the.Mtn.Man said:


> I don't see how there has been anything objectionable posted here. As far as I can tell, participants are being polite, and I personally have found the discussion interesting.
> 
> But if a moderator feels he just has to close it, I guess that's his prerogative. :shrug:


Nothing particularly objectionable has been posted, if you overlook the wheedling and back-handed point-scoring BS that has gone on, or if you don't mind being bored to death (which, personally, I DO mind). People are being polite, or at least they are pretending to be. But there's no getting away from the fact that as I mentioned above, *this is a LED vs. Incan thread in disguise.*

I'm not closing it quite yet. In plain English, that means I will close it just as soon as someone gives me sufficient reason to do so.


----------



## uk_caver (Sep 5, 2011)

Maybe as a more occasional visitor than many, and as someone who only tends to read certain sections (headlights, electronics, LED), I'm rather less likely to see parallels with other threads that might have bored people than members who spend much more time here than I do, especially people who may, as part of their moderator duties, be _obliged_ to read any number of threads which get contentious or which are repeating old arguments.
My 'that's interesting' could very easily be someone else's 'oh no, not again', and I guess if they're much better informed and involved than I am, if anyone is right when it comes to a thread's value to the site as a whole, it's probably not going to be me.

From my position of relative ignorance in that respect, the thread does seem interesting even if it started off maybe a touch on the warm side.
Regarding the LED-vs.-incan stuff, which I do personally find dull when I encounter it in general, that's not really how I'd read the thrust of things, though I can often be blind to a great deal if I'm reading from a purely technical perspective.


----------



## easilyled (Sep 5, 2011)

Personally, I think it's informative to know that a lower CRI can sometimes render color better than a higher CRI, depending on other factors like CCT. 

In my opinion, highlighting this limitation in CRI is not due to having an anti-incandescent agenda but just for the sake of improving our collective understanding about the subject.


----------



## TyJo (Sep 6, 2011)

Deleted.


----------



## TwitchALot (Sep 8, 2011)

DM51 said:


> It seems to me there are a whole lot of people mincing around the real issue here, which is that this is a LED vs. Incan thread in disguise. It's a boring and sterile discussion, with the pro-Incan people pushing the benefits of high CRI, and the pro-LED people saying CRI is irrelevant or unimportant.
> 
> I'm already regretting I split the posts off from the HDS thread and let the discussion continue here. I should have deleted the whole lot. And that is pretty much what will happen, unless I see less BS and partisan point-scoring going on here. It's a *BORING THREAD*, with a lot of long-winded and ill-informed drivel from people who have scant understanding of the topic.
> 
> ...





uk_caver said:


> Maybe as a more occasional visitor than many, and as someone who only tends to read certain sections (headlights, electronics, LED), I'm rather less likely to see parallels with other threads that might have bored people than members who spend much more time here than I do, especially people who may, as part of their moderator duties, be _obliged_ to read any number of threads which get contentious or which are repeating old arguments.
> My 'that's interesting' could very easily be someone else's 'oh no, not again', and I guess if they're much better informed and involved than I am, if anyone is right when it comes to a thread's value to the site as a whole, it's probably not going to be me.
> 
> From my position of relative ignorance in that respect, the thread does seem interesting even if it started off maybe a touch on the warm side.
> Regarding the LED-vs.-incan stuff, which I do personally find dull when I encounter it in general, that's not really how I'd read the thrust of things, though I can often be blind to a great deal if I'm reading from a purely technical perspective.


 
I'm with uk_caver on this, and I think this is why most of us are not seeing what you are and were so surprised you brought up the LED vs. incan disguise. This discussion about CRI has been very technical; it's not about LED vs. incan, although the two do have noticeably different spectra, and I can see how a couple of passages touched on that issue. I can understand why some people would not find technical threads interesting, but please don't assume no one does, or that a thread is worth closing simply because you don't find it interesting, particularly since the technical side of things has already been thoroughly covered.


----------



## Sparky's Magic (Sep 8, 2011)

Great thread; however, for the average 'dog walker' (like me), the answer is much simpler, and I'm with Lillian on this one. Whatever looks best to you is best for you. I'm not looking at specs when on the bush track, I'm looking where I'm going. :thumbsup:


----------



## ElectronGuru (Sep 9, 2011)

*Re: HDS Systems EDC # 15*

Light source color tends to be one of the more [surprisingly] technical subjects on CPF.

Much of the discussion here has so far centered on the question of the ideal tint. While important, it isn't a concern unique to high CRI LEDs. And as there is no right answer, it's also a source of consternation.

What sets high CRI LEDs apart is not their central (ideal) tint, but the breadth of their color reveal. Thinking of the central tint as 0, a low CRI LED can show colors -1 0 1, while a high CRI LED shows more colors, more clearly, further from 0, more like -4 -3 -2 -1 0 1 2 3 4. 

The phosphors applied to a high CRI LED attempt to more closely mimic light sources (natural and man-made) that provide this breadth naturally. The CRI scale is one method of measuring the success of this effort to develop LEDs past their inherently narrow color range.


----------



## TwitchALot (Sep 14, 2011)

*Re: HDS Systems EDC # 15*



uk_caver said:


> When it comes to 'conceivably useful colors', presumably that's somewhat subjective, or at least dependent on biology and environment?



Well sure - the lighting needs of one may not match the lighting needs of another. 



> In areas of the spectrum where colours look pretty much the same over a relatively wide range of wavelengths (biology), if the things we tend to look at (environment) don't have a particularly peaky spectrum when lit buy uniform coloured light, then actual colour rendering would presumably be basically the same for a light that had a flat output and one that had a spectrally rough one?



I disagree that things look the same over a wide range of wavelengths in biology, but maybe we're thinking about different things. I'm not sure I understand your question, either. The light source will emit some spectrum, so if you're talking about the sun being the environment, the sun emits a fairly smooth spectrum. How the color is perceived depends not only on us and the light source, however, but also on how the object absorbs/reflects/emits/responds to light. 



> Likewise, if there are areas of the spectrum where the eye is more colour-discriminating, depending on the objects in the environment it could presumably still be possible to have a light that looked good even if it had a trough in the spectrum - for example, an LED light with a relative lack in the cyan area might work pretty well at lighting up cyan objects if real-world cyan objects looked cyan largely because they reflected/emitted decent amounts of bluer and greener light rather than just being peaky in the cyan area of the spectrum?



Right.



> Presumably, a lot also depends on whether a light source is just one of many, or the only type around - a light source lacking at the blue end of the spectrum may well look much less biased than it actually is if it's the sole source of light, due to the eye adapting to the light and effectively dialling up its sensitivity to blue via chemical changes in the retina (and possibly in downstream processing as well)?



Agreed.



> How would one rate a light source that looked effectively perfect with almost all objects, but made the odd object which used very spectrally spiky pigments look wrong?



I'm not sure what you're referring to, but it depends on how we develop the standards and what it entails. Subjectively, if one strange object looked funny under that light, odds are, I wouldn't even care. How often am I going to run into this weird object anyway?



> I dare say it would be possible to make colour charts which looked fine in daylight/black body light but pretty wrong in more 'peaky' light, and also to make colour charts which looked pretty similar in both. I wonder if it would be practical to have test charts that really reflected real-world colours both in their actual colour and their overall spectra, or maybe test charts that had one or more patches of the 'same' colour achieved in different ways, some 'flatter' than others?



I'm not sure if this could be accomplished, as I'm not sure how the color charts are made or the properties of the inks. But I suppose it's possible, at least in theory.


----------



## TwitchALot (Sep 14, 2011)

Sparky's Magic said:


> Great thread, however for the average 'dog walker' (like me), the answer is much more simple and I'm with Lillian on this one. Whatever looks the best (to you) is the best for you. I'm not looking at specs. when on the bush track, I'm looking where I'm going. :thumbsup:


 
I agree that the preference is subjective. But short of buying the light and seeing it in person first, we need a way to measure what a light will objectively look like, to make it easier for consumers to know what to expect beforehand. CRI is useful if you understand that it does not measure color rendering in the absolute, as the name implies, but similarity to a specific reference source. I know that I like 3000 K incandescent light, for example. So for me, a CRI rating is useful, since it tells me approximately how well an LED compares to an incandescent in terms of the way colors look under that light. But if someone has no idea what color temperature they prefer and wants something that looks as close to daylight as possible, or wants something that renders a specific scene in a certain way (film?), CRI is a lot less useful as a measurement.
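To make the "similarity to a reference" point concrete, here is a hedged sketch of how the general CRI (Ra) score is assembled. The hard part - computing each test sample's colour shift ΔE between the test source and the reference illuminant in the CIE 1964 U\*V\*W\* space, after chromatic adaptation - is replaced by made-up ΔE values; only the final scoring step (R_i = 100 − 4.6·ΔE_i, averaged over the eight standard samples) is the real formula:

```python
def special_cri(delta_e_uvw):
    """Special colour rendering index R_i for one test colour sample."""
    return 100.0 - 4.6 * delta_e_uvw

# Hypothetical colour shifts for the eight standard test samples (TCS1-8)
# under some imaginary LED, versus a reference source of the same CCT:
delta_es = [1.2, 2.5, 0.8, 3.1, 2.0, 1.5, 2.8, 2.2]

r_values = [special_cri(de) for de in delta_es]
ra = sum(r_values) / len(r_values)
print(f"Ra = {ra:.1f}")   # -> Ra = 90.7 for these made-up shifts
```

A source that shifted no sample at all (every ΔE = 0) would score Ra = 100 - which is exactly why an incandescent matches its reference perfectly, and why Ra says nothing about whether you'd *like* that reference's colour temperature in the first place.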


----------



## uk_caver (Sep 14, 2011)

*Re: HDS Systems EDC # 15*



TwitchALot said:


> I disagree that things look the same over a wide range of wavelengths in biology, but maybe we're thinking about different things.


Yes - I was trying (probably too concisely) to refer to the fact that our [biological] colour receptors have broad sensitivity, and so a spiky LED output might not lead to inaccurate colours being seen if the absorption (or non-absorption?) spectra of objects in the environment are fairly smooth, as they often seem to be for many things (including, it seems, the reference samples on which CRI is based).
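That broad-receptor point can be sketched numerically (a toy model of my own, not a standard metric): a receptor with a broad Gaussian sensitivity responds almost identically to a smooth band of light and to a spiky spectrum carrying the same total power, so spectral "roughness" alone need not change the colour signal:

```python
import math

def sensitivity(wavelength_nm, peak=550.0, sigma=40.0):
    """Broad Gaussian stand-in for a cone's sensitivity curve."""
    return math.exp(-((wavelength_nm - peak) ** 2) / (2 * sigma**2))

wavelengths = range(400, 701)

# Smooth source: flat power across 500-600 nm.
smooth = {w: 1.0 if 500 <= w <= 600 else 0.0 for w in wavelengths}
# Spiky source: the same total power, concentrated in two narrow lines.
total = sum(smooth.values())
spiky = {w: total / 2 if w in (525, 575) else 0.0 for w in wavelengths}

def response(spd):
    """Receptor signal: spectral power weighted by sensitivity, summed."""
    return sum(power * sensitivity(w) for w, power in spd.items())

r_smooth, r_spiky = response(smooth), response(spiky)
print(f"smooth: {r_smooth:.1f}  spiky: {r_spiky:.1f}")
# the two responses differ by only a few percent
```

With all three cone types and real object reflectances the picture is more complicated, of course, but this is the mechanism by which a peaky emitter can still render smooth-spectrum objects acceptably.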

In general, maybe it'd be useful to have different colour accuracy measures for different people. A professional doing colour-sensitive work may be most interested in having the smallest maximum deviation for a test colour, whereas for domestic or other industrial use a different measure might be useful - whether because it ties in more with the overall subjective feel of a particular light's spectrum, or because it was rather more economical to make some particular compromise which sacrificed accuracy in places where it might rarely be missed.

I suppose that in the future, at least the technology may start to be more widely available to allow people to choose colour and maybe CRI of lighting on the fly, so potentially we might have the opportunity (stick dataloggers in a decent number of colour-adjustable fixtures?) to start working out what kinds of light people actually *do* like in ways that we can't easily do currently except in artificial test situations.

_That_ information might end up being rather more useful than any artificial measure.


----------



## Anders Hoveland (Apr 17, 2015)

Brasso said:


> I love the high CRI xpg. Very soothing tint. Easy on the eyes.


I too have noticed high CRI LEDs tend to be a little easier on the eyes. I think it is because if there is more cyan in the spectrum, there does not need to be quite as much blue allowed to get through.
Also, some of the latest high CRI emitters are utilizing blue-green phosphors, and longer blue wavelengths tend to be a bit less harsh than shorter ones.

This is a completely different issue than color tint though.


----------



## SemiMan (Apr 18, 2015)

Anders Hoveland said:


> I too have noticed high CRI LEDs tend to be a little easier on the eyes. I think it is because if there is more cyan in the spectrum, there does not need to be quite as much blue allowed to get through.
> Also, some of the latest high CRI emitters are utilizing blue-green phosphors, and longer blue wavelengths tend to be a bit less harsh than shorter ones.
> 
> This is a completely different issue than color tint though.



Notice last post date ...

"09-14-2011, 05:37 PM #79
uk_caver "


----------



## easilyled (Apr 18, 2015)

SemiMan said:


> Notice last post date ...
> 
> "09-14-2011, 05:37 PM #79
> uk_caver "



Funny you should say that. I noticed it too and then wondered about what agendas a person desperate to resurrect such an old thread might have.


----------



## Shooter21 (Apr 23, 2015)

I EDC both a Haiku with the Nichia 119 and an HDS with an HCRI XP-G, since I can't decide which tint I like better.


----------

