High-CRI is no good for video lights? Really?

Mooreshire

Newly Enlightened
Joined
Jul 22, 2011
Messages
156
Location
Seattle, WA
I've heard it said that there are folks here on the forum who appreciate high-CRI diodes for photography and video lighting applications. Makes perfect sense to me!

I'm wondering - are there any side-by-side shots depicting the benefits?

If the benefits are noticeable enough, I'll clearly have to retrofit all my video lights. I use 3CCD (triple color chip) video cameras, so they are quite color sensitive.

Unfortunately, I've just read this in the Wikipedia article about CRI:

"Problems have been encountered attempting to use otherwise high CRI LED lighting on film and video sets. The color spectra of LED lighting primary colors does not match the expected color wavelength bandpasses of film emulsions and digital sensors. As a result, color rendition can be completely unpredictable in optical prints, transfers to digital media from film (DI's), and video camera recordings. This phenomenon with respect to motion picture film has been documented in an LED lighting evaluation series of tests produced by the Academy of Motion Pictures Arts and Sciences (AMPAS) scientific staff"

The citation following that statement didn't lead me directly to the information it claimed to support, and I'm wading through AMPAS's "Solid State Lighting Project" site in search of specific confirmation. It might be somewhere inside a video lecture series or a PDF tech report, but I can't find easy confirmation or any photo documentation of the incompatibility.

So what do folks think? Stick with my many Cool White XP-Gs?

I'll surely buy some Nichia 219s and run my own side-by-side analysis, but I'm wondering if anyone has beaten me to it already?
 

bshanahan14rulz

Flashlight Enthusiast
Joined
Jan 29, 2009
Messages
2,819
Location
Tennessee
To me it sounds like they're saying you're better off with red, green, and blue LEDs whose wavelengths are centered on their definitions of red, green, and blue.

I doubt that they've tested any new LEDs; a series of tests takes them a long time, whereas it would be one night of taking comparison photos for us hobbyists.

I'll tack on this suffix here: I don't know jack squat about photography, and I have the worst eye for color, ear for music, and tongue for taste. But it sounds to me like if you're making a big video production, you might as well stick with the several-thousand-watt halogens or whatever they use for "white" light.
 

Mooreshire

Newly Enlightened
Joined
Jul 22, 2011
Messages
156
Location
Seattle, WA
Ah, the citation did indeed pertain to a video symposium series, which I had to track down and watch. It was interesting, but the video was downsampled so badly that you couldn't really see what they were talking about.

There was clearly no problem lighting their test scenes.

It seemed moderately outdated, and the criticism was mostly personal opinion aimed at comparing LEDs claiming good CRI ratings to tungsten and the like. The criticisms seemed valid, but pertained more to how a manufacturer can simply game the test by tuning output to the few bands that the index checks.

I'll be using LEDs unconditionally because I'm on battery power and need to keep things small. I'd be more interested in hearing about better tests between diode models and figuring out the best choice for a given sensor arrangement.
 

mahoney

Enlightened
Joined
Jan 7, 2002
Messages
603
The simple answer is that the human eye responds to light a bit differently than a camera does. And digital cameras respond differently than film does.

Initially the CRI test involved humans looking at a test card under a known full-spectrum light source (usually a high color temperature incandescent) and then under the light source being tested, and rating the results on a scale of 100. Pretty subjective, especially because the test was performed by the manufacturer of the light source. The test has since become more formalized and scientific, but it's not perfect yet.
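
For a sense of what the formalized version boils down to: render a set of standard test color samples under a reference illuminant and under the test source, measure the color shift (delta-E) of each sample, penalize, and average. Here's a minimal Python sketch; the penalty formula is the standard CRI one, but the delta-E values are made up purely for illustration:

```python
# Minimal sketch of a CRI-style Ra score. Each standard test color
# sample is rendered under a reference illuminant and under the test
# source, the color shift (delta-E, in the CIE 1964 color space) is
# measured, and the penalized scores are averaged.

def special_cri(delta_e: float) -> float:
    """Special rendering index R_i for one test sample:
    R_i = 100 - 4.6 * delta-E."""
    return 100.0 - 4.6 * delta_e

def general_cri(delta_es: list[float]) -> float:
    """General index Ra: the mean of R_i over the 8 standard samples."""
    return sum(special_cri(de) for de in delta_es) / len(delta_es)

# Hypothetical color shifts for the 8 CIE test samples under some LED;
# these numbers are invented for illustration only:
sample_shifts = [1.2, 2.5, 0.8, 3.1, 1.9, 2.2, 1.4, 2.7]
print(f"Ra = {general_cri(sample_shifts):.1f}")  # about 90.9
```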

And it's not so simple as just using a "high" CRI light source. Say you have a CRI 90 light source whose emission spectrum is slightly deficient at a particular wavelength of green light, a wavelength your camera is particularly sensitive to, or the very wavelength that the green paint on your set reflects best. Your set was painted under sunlight, and you approved the color under sunlight. Even under the new light source your eyes may not tell you there's a problem; you only see it when looking at the monitor or the film.

The best you can do is look at the emission spectra of the emitters you want to use and the sensitivity profiles of the cameras you use, and try to get a good match. Even then there may be an unexpected problem: while incandescent light is a full-spectrum source, compared to sunlight it is deficient in blue, hence all the money spent on "color correction filters" in the film industry. You have to pick a light source/color temperature and stick with it throughout the film, or digitally correct any scenes lit by a different light source.

Presumably some manufacturers have done this research into the proper emitters. There are lots of LED-based film lights out there, and I can't imagine anyone would keep using them if they were causing color problems; movie making is too expensive to muck about with stuff that makes problems.
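
To make the matching exercise concrete, here's a toy Python sketch of the green-paint scenario above. The curves are crude Gaussians invented for illustration, not real LED or sensor data: one flat "full spectrum" source, and one LED-like source with a dip right where a hypothetical camera's green channel peaks.

```python
import numpy as np

# Toy illustration of matching an emitter's spectrum to a camera's
# sensitivity. All curves are invented Gaussians, not real data.

wl = np.arange(400.0, 701.0, 5.0)  # wavelengths in nm, 5 nm steps

def gaussian(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical camera green-channel sensitivity peaking near 530 nm:
green_channel = gaussian(530, 30)

# Two light sources: a flat "full spectrum" reference, and an LED-like
# spectrum with a dip right where the green channel is most sensitive.
reference = np.ones_like(wl)
led = 1.0 - 0.4 * gaussian(530, 15)

def green_response(spd):
    # Predicted channel output: integrate SPD x sensitivity over wavelength.
    return float(np.sum(spd * green_channel) * (wl[1] - wl[0]))

ratio = green_response(led) / green_response(reference)
print(f"Green channel under LED vs. reference: {ratio:.2f}")  # about 0.82
# Well below 1.0: the set's greens record darker on camera than they
# looked to your eye when you approved the paint under the reference light.
```

Both sources could still post respectable CRI numbers, which is exactly why looking at the actual spectra beats trusting the single-number rating.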
 

briaowolf

Newly Enlightened
Joined
Dec 4, 2015
Messages
7
I know this is an old thread, but it's right in line with what I'm looking into. I'd love a high-CRI LED flashlight to use in a video-based project. Does the color change as the battery wears down? Any other color changes I should be aware of as I shoot with them?

Thanks!
 