# Human retina has UV-sensitive photoreceptors



## jtr1962 (Aug 28, 2004)

Here is a very interesting article that explains quite a few things. Apparently the eye contains four types of photoreceptors rather than three. The fourth one is sensitive to light from 300 nm to 400 nm. The main reason most people can't see light below about 400 nm is that the lens absorbs most of it. However, those with thinner lenses can see further into the short-wavelength region than most people. Now I know why I find full sunlight very disconcerting, and also why UV LEDs seem much brighter to me than they should. This may also explain my strong preference for high color temperature light (I respond more to short wavelengths than most people, so I notice it more when they're missing).


----------



## Ken_McE (Aug 28, 2004)

Hi. This Fulton guy has an interesting theory, and he makes serious-looking web pages, but I don't believe his theories are widely accepted. This doesn't mean he's full of *&&!!, just that you should approach his work critically, like any other passing web site.

There *might* be human tetrachromats, but the extra cone should be somewhere in the red to green range, not further into the UV. *If* human tetrachromats exist they are vanishingly rare.

Humans can see into the UV, but you don't need a whole new sensor to do it. The cones that see blue will also work into the near UV. I don't know if this is a design feature, but the lens of your eye filters out UV on purpose. UV tends to damage living cells, including the ones that make up the inside of your eyes. Short-lived animals like shrimp and deer and insects can afford to use UV for vision; they'll be dead before the UV burns out their sight.

You may see further into the UV than most people. If so, you should use this ability sparingly. Long-lived animals, like you, need to be careful about that stuff because you could easily live long enough for it to catch up with you. I would suggest good UV-blocking sunglasses as a routine accessory on sunny days. Be particularly careful if you are out on the water or in snowfields when it is sunny.

The web has essentially nothing about what it is like for a person to see UV. I would be curious to hear more about this if you should care to post about it.

Regards, [email protected]


----------



## jtr1962 (Aug 29, 2004)

Since I have no point of reference for what I "should" see, it's very difficult for me to describe. Probably the biggest difference would be in how I perceive light sources. Remember that UV causes many materials to fluoresce, so the fluoresced color overwhelms how the object would appear if it simply reflected the UV. These objects therefore appear no different than they would to a person with normal vision. Even when an object doesn't fluoresce, it often only reflects light within a small bandwidth, so here again its color will not appear noticeably different. However, with light sources I should in theory definitely perceive any UV. In practice I don't, because the glass used to make bulbs and fluorescent tubes blocks most of the UV.

I think the main thing I see differently is that I perceive blues more strongly than most, or maybe I perceive the UV as blue. Regardless, I'm very sensitive to when a light source is deficient in blue. This whole phenomenon may also explain why I prefer fluorescent lighting. A small amount of UV from the mercury arc does in fact make it through the tube. Maybe this makes the light closer to sunlight for me. I can go out in full sun but don't feel comfortable staying out for any length of time.

Sorry I can't be much more specific than that. Also, I think I only have a fairly slight version of UV sensitivity, but enough for me to somehow perceive light sources differently. I think the only person who could describe the phenomenon accurately would be someone who had their lenses removed because of cataracts. They would have a point of reference from their vision before the condition. I don't.


----------



## Frangible (Sep 7, 2004)

Sorry but this isn't very convincing. He merely shows a weak response to some UV, but the wavelength response of all human eyes is not the same, and there is some bleedover into both UV and near-IR.

Further, the UV light source itself has some frequency bleeding... did he account for that?

Not even during LASIK when my eye was cut open and the flap lifted could I see the beam of the UV laser. Yet, I can see "blacklights" and UV LEDs because of their frequency bleed (that the laser doesn't have).


----------



## jtr1962 (Sep 7, 2004)

See this page. The lasers used in LASIK surgery mostly seem to have a wavelength of 193 nm. This is well below the minimum 300 nm response in that article. Yes, UV LEDs and blacklights do have some frequency bleed, but not enough to account for how bright they appear to some people. I've heard that a dim purple UV LED can appear as a blindingly bright bluish-white to someone with extreme UV sensitivity. For me personally it isn't that extreme, but they appear somewhat brighter than they "should" according to the mcd specs (which are usually in the 100 to 300 range). I'd say I see a 300 mcd UV LED as closer to 1000 mcd or so.


----------



## Frangible (Sep 7, 2004)

I'm not disputing any of that-- I'm sure some people can see UV better than others. However, that is not proof of a new type of photoreceptor in the human eye.

The general guidelines of what human vision can perceive are exactly that: guidelines.

Similarly, consider the hearing range "guideline" of 20 Hz to 20 kHz. I can hear well beyond either boundary. It doesn't mean that I have two or three extra sets of specialized ears, just that generalizations are sometimes incorrect for individuals.

Another (in)famous generalization is how many frames per second the eye can perceive. A lot of people say 30, or 60, or something. I can easily perceive up to about 120ish. What new structure do MY eyes have?


----------



## jtr1962 (Sep 7, 2004)

Maybe then one of the three existing photoreceptors is sensitive down to 300 nm, at least in some people? I'm not saying a fourth photoreceptor exists either, just that it sounds somewhat better than any other explanation so far. It also makes sense from an evolutionary standpoint: why should a good part of the solar radiation reaching earth be outside the so-called visible spectrum, especially the more energetic wavelengths?

Regarding your perception of high frame rates, I think this may have more to do with the brain than with the eye. The photoreceptors have a fast enough response time that images at well over 100 fps cause them to send a discontinuous signal up the optic nerve. In most cases the brain treats this as continuous motion. In others, there is still some perception of flicker. To me 60 fps is very noticeable, and I still notice that something isn't quite right at 85 fps. I also have problems under fluorescent lights with a magnetic ballast (this is 120 Hz flicker). For the majority these perceptions don't exist. For you and me they do.

I also hear somewhat above 20 kHz, mostly thanks to the fact that I never damaged my hearing by listening to loud music. Not sure about the other end (low frequencies). This may in fact be physiological: some of the small hairs in your ears which vibrate to sound may be shorter or longer than usual, and thus receive higher or lower frequencies, respectively. Maybe a similar phenomenon exists in the eyes of UV-sensitive people, where a photoreceptor responds to much shorter wavelength light than normal and the lens is also thinner than normal to let this UV light in (perhaps these conditions tend to occur together). On the other hand, if _everyone_ can see UV with their lens removed, then I think the fourth photoreceptor theory is more plausible.


----------



## NewBie (Sep 7, 2004)

[ QUOTE ]
*jtr1962 said:*
Maybe then one of the three existing photoreceptors is sensitive down to 300 nm, at least in some people? I'm not saying a fourth photoreceptor exists either, just that it sounds somewhat better than any other explanation so far. It also makes sense from an evolutionary standpoint: why should a good part of the solar radiation reaching earth be outside the so-called visible spectrum, especially the more energetic wavelengths?

Regarding your perception of high frame rates, I think this may have more to do with the brain than with the eye. The photoreceptors have a fast enough response time that images at well over 100 fps cause them to send a discontinuous signal up the optic nerve. In most cases the brain treats this as continuous motion. In others, there is still some perception of flicker. To me 60 fps is very noticeable, and I still notice that something isn't quite right at 85 fps. I also have problems under fluorescent lights with a magnetic ballast (this is 120 Hz flicker). For the majority these perceptions don't exist. For you and me they do. I also hear somewhat above 20 kHz, mostly thanks to the fact that I never damaged my hearing by listening to loud music. Not sure about the other end (low frequencies). This may in fact be physiological: some of the small hairs in your ears which vibrate to sound may be shorter or longer than usual, and thus receive higher or lower frequencies, respectively. Maybe a similar phenomenon exists in the eyes of UV-sensitive people, where a photoreceptor responds to much shorter wavelength light than normal and the lens is also thinner than normal to let this UV light in (perhaps these conditions tend to occur together). On the other hand, if _everyone_ can see UV with their lens removed, then I think the fourth photoreceptor theory is more plausible.

[/ QUOTE ]

Okay, how about you order a couple of filters, that cut off everything below 350nm, and stack them, then see what you can see?


----------



## jtr1962 (Sep 8, 2004)

Actually the cutoff would have to be closer to 380 nm, since that's where most sensitivity tables end, but the idea is sound. I would imagine full sunlight to be a good place to test this. First try the filter behind a few panes of glass (which should block nearly all UV); everything should look completely black. Next try the filter without the glass. A more UV-sensitive person should see a brighter image, and should keep seeing one with filters of lower cutoff wavelengths.

Another thing I found somewhat interesting looking at one of these tables is that the photopic sensitivity at 380 nm is 0.027 lm/W (i.e. practically no response for all intents and purposes), while the _scotopic_ sensitivity at the same wavelength is 1.001 lm/W (about the same as the photopic value at 410 nm). Going by the dropoff rate of the scotopic values, there should be some response down to about 350 nm for almost _everybody_. While we can debate relative degrees of UV sensitivity, this suggests the real short-wavelength cutoff is well below the usually quoted value of 380 nm to 400 nm.
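The dropoff extrapolation above can be sketched numerically. The three efficacy values below are standard CIE scotopic figures (1700 lm/W times V'(λ) at the tabulated wavelengths); the exponential fit to the tail is only an illustration, since the real curve is not exactly exponential:

```python
import math

# CIE scotopic luminous efficacy (lm/W) at tabulated wavelengths,
# i.e. 1700 lm/W * V'(lambda).
scotopic = {420: 164.2, 400: 15.8, 380: 1.001}

# Per-nm exponential decay rate fitted to the 420-380 nm tail.
k = math.log(scotopic[420] / scotopic[380]) / (420 - 380)

def extrapolate(wavelength_nm):
    """Rough extrapolated scotopic efficacy below 380 nm (illustration only)."""
    return scotopic[380] * math.exp(k * (wavelength_nm - 380))

print(round(extrapolate(350), 3))  # roughly 0.02 lm/W: tiny, but not zero
```

By this crude fit the response at 350 nm is several thousand times weaker than at 420 nm, but still nonzero, which is the point being argued: a bright enough 350 nm source could in principle register.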


----------



## ZDP189 (Sep 8, 2004)

From what I remember of my neurophysiology classes, there are three types of cone cell, each with its own response curve across wavelengths. The curve peaks at one wavelength but is roughly bell-shaped, so each cone picks up wavelengths on either side of its peak to a lesser degree. A kind of fuzzy logic then allows determination of the actual wavelength, and normalisation of intensity, from the degree of overlapping response between two types of receptor.

UV and IR are defined by a perceptual sensitivity threshold, not by a hard limit on cell responsivity.

Therefore, yes, one can no doubt demonstrate an in vitro response to UV, but no, I do not believe it significantly affects intensity perception.
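The overlapping-response idea above can be sketched with a toy model. The Gaussian curve shapes, the 445/540 nm peaks, and the shared width are rough illustrative numbers, not measured cone physiology:

```python
import math

S_PEAK, M_PEAK, WIDTH = 445.0, 540.0, 60.0  # toy cone peaks (nm) and curve width

def response(wavelength, peak):
    """Bell-shaped (Gaussian) cone response to a given wavelength."""
    return math.exp(-((wavelength - peak) / WIDTH) ** 2)

def encode(wavelength):
    """What gets sent up the optic nerve: one response value per cone type."""
    return response(wavelength, S_PEAK), response(wavelength, M_PEAK)

def decode(s, m):
    """Recover the wavelength from the ratio of the two overlapping responses."""
    # For equal-width Gaussians, log(m/s) is linear in wavelength, so invert it.
    return (S_PEAK + M_PEAK) / 2 + math.log(m / s) * WIDTH**2 / (2 * (M_PEAK - S_PEAK))

s, m = encode(480)
print(round(decode(s, m)))  # recovers 480 from the two response values alone
```

Neither toy cone alone can tell a dim 480 nm light from a brighter off-peak one; only the ratio of the two overlapping responses pins down the wavelength, which is the mechanism described above.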

I believe the reasons you think UV lights are brighter than they should be are:

(1) Some materials frequency-shift light, absorbing UV and re-radiating in the visible spectrum, e.g. some washing powders, fingernails, fluorescent tubes and lots of other things.

(2) UV sources are not specific enough and do radiate in the visible spectrum. That's why UV blacklight tubes look purple, not black, when operating.


----------



## udaman (Sep 8, 2004)

120 Hz flicker with magnetic ballasts? Umm, what about 60 Hz? What about flicker with those crappy electronic ballasts; are you sure they are magnetic?

Seems like a bunch of placebo effects I see (pun intended) here. Like I said before in the other thread, being trained as an amateur photographer (though truth be known, the supposedly highly proficient reviewers, or at least those believed by the majority of readers to be 'expert opinion', can't even notice how poorly my Nikon 775 point 'n shoot digicam renders certain colors), I have rather refined needs as far as color accuracy, and notice deviations from those accuracies like a sore thumb... so to speak. I hate the ugly orange glow of Philips warm white 40 W T12 linear fluorescent tubes, but I also don't care for the cool white because it's too bluish (the lesser of two evils, I suppose). The cool white perception that jtr speaks of as being more usable seems to me to be more placebo effect than reality.

I see just fine with halogen headlamps, although I'd admit the bluish white HID light in a headlamp will show higher contrast with reflective surfaces such as road signs and road paint designed to reflect light. However, much of the advantage of HIDs comes from the fact that they output significantly more light in total compared to the legally limited halogen bulbs (you need approximately 3x the wattage of a halogen bulb to match the output of the legal HID headlamps); given equivalent lumen outputs, there is far less of an advantage. I see better with the warm white CE 23 W CFL, and notice no advantage of the 'cool white' 23 W CFL, except that the cool white has more annoying glare. While the glare is difficult to mitigate even with frosted envelopes, the metal halide HIDs used for stadium and large-venue lighting are always more palatable to me because they appear whiter to the eye *and* have good color rendering capability, compared to any full spectrum fluorescent tube.

Many people can hear out to 22 kHz, but those who hear beyond that are few and far between. I LOVE loud music, just not quite at the 120 dB levels that cause more severe hearing loss. The cannon blast on some orchestral pieces is just as loud as some ear-splitting rap music, but a short exposure (if not at too high a dB level) does less damage to hearing than a continuous, longer-playing sound. It also depends on what frequency the high-dB sound is at, and whether it is a single tone or a wide range of continuous loud sound. So this is not a good analogy, IMHO; sound is different from light, and the degree of damage from a given length of exposure differs between the two.

It does not take much exposure to the harmful rays from an arc welder or acetylene torch to damage your eyes. The fact that jtr or someone else can perceive certain wavelengths as being slightly stronger or weaker may only be a very, very mild deviation from the norm. Within that group, some people may be capable of discerning a difference but never really pay attention, and therefore will not notice; jtr may simply notice the same sensitivities more than someone less observant would. I am unconvinced in the absence of more concrete and compelling evidence.

[ QUOTE ]
It also makes sense from an evolutionary standpoint-why should a good part of the solar radiation coming to earth be outside the so-called visible spectrum, especially the more energetic wavelengths? 


[/ QUOTE ] 

Huh? Why? Evolutionary sense? That's a bit of a conundrum if you ask me; where is it proven that evolution follows a logical guideline in progression? Hehe, humans seem to be the antithesis of such a hypothesis, if you consider our present state of primitive, uncivilized behavior. Why would evolution favor more energetic wavelengths over less energetic ones?


----------



## jtr1962 (Sep 8, 2004)

[ QUOTE ]
*udaman said:*
120hz flicker with magnetic ballasts? Umm, what about 60Hz? What about flicker with those crappy electronic ballasts, are you sure they are magnetic?


[/ QUOTE ]
It's a 60 Hz sine wave, so it crosses zero twice every cycle; that's where the 120 Hz flicker comes in. Yes, some electronic ballasts with small filter caps also produce some modulation at 120 Hz, but it's of a much smaller magnitude than the on-off behavior of tubes on plain magnetic ballasts. I even found some modulation on a transformer-powered halogen desk lamp. BTW, I use a spinning disk with a strobe pattern to detect flicker: when it spins close to an integral fraction of 120 rotations per second (i.e. 60 rps, 40 rps, 30 rps, etc.) you'll see a slowly moving pattern if there's flicker or modulation of any sort. And I'm 100% sure which lights have an electronic ballast and which don't.
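The zero-crossing argument can be checked numerically: an idealized lamp whose light output follows the square of a 60 Hz sine voltage has no 60 Hz component at all in its output, only DC plus a ripple at 120 Hz. A minimal sketch (the single-bin DFT here is just for illustration):

```python
import math

MAINS_HZ = 60
SAMPLES = 12000  # one second of simulated light output at 12 kHz

# Instantaneous light power of an idealized lamp on a sine supply:
# proportional to voltage squared, so it dips to zero at every zero-crossing.
power = [math.sin(2 * math.pi * MAINS_HZ * n / SAMPLES) ** 2
         for n in range(SAMPLES)]

def amplitude_at(freq_hz):
    """Magnitude of the DFT bin at freq_hz over the one-second record."""
    re = sum(p * math.cos(2 * math.pi * freq_hz * n / SAMPLES)
             for n, p in enumerate(power))
    im = sum(p * math.sin(2 * math.pi * freq_hz * n / SAMPLES)
             for n, p in enumerate(power))
    return math.hypot(re, im) / SAMPLES

print(amplitude_at(60) < 1e-9)   # True: no 60 Hz component in the light output
print(amplitude_at(120) > 0.2)   # True: strong 120 Hz component (the flicker)
```

Mathematically this is just sin²(ωt) = (1 − cos 2ωt)/2, which is why the flicker fundamental sits at twice the mains frequency.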

[ QUOTE ]

Cool white perception that jtr speaks of as being something that is more usable, it seems to me to be more some believed placebo effect rather than reality.


[/ QUOTE ]
Actually, it's been pretty well documented that at light levels typical of indoors the scotopic sensitivity curve is partially in effect. Here is one of many articles on the subject. Maybe you don't perceive as much of a difference because the glare you mentioned counteracts some of the improved contrast (just a theory). You don't happen to wear glasses or even contacts, do you? Glasses cause horrible glare (which is why I don't wear mine other than for watching TV), and it's well documented that bluer light refracts and scatters more when passing through optics. That's why we have amber fog lights: microscopic water droplets are an optic of sorts.

Another thing is that the CCT of non-broadband light sources is very, very misleading. I see metal halide lights in parking lots all the time now, and these generally come in only two varieties, ~3500K and ~4100K, unless we're talking expensive stadium lighting. Anyway, the supposed 4100K variety looks as blue to me as the 6500K white LEDs used in the pedestrian crossing signs. Maybe for technical reasons the CCT is 4100K, but it sure doesn't look that way to my eyes. This is probably why "cool whites" look very bluish to you: the commodity ones with poor CRI are pretty far off the Planckian locus. Ditto for the warm whites.

You need to compare two tubes of equal and good CRI to notice the effect I'm talking about. Try comparing better grade T-8 tubes. Common T-8 tubes seem to come in two varieties: the slightly cheaper ones have a CRI of around 78, which I suppose is suitable for utility areas, although to me they make no sense when the better grade costs about $0.50 more. The better grade (i.e. SPX for GE tubes) has a CRI of 85 or 86. Try a comparison using these; if you do it with poorer quality tubes it's like comparing apples and oranges. In general, light sources with higher CRI seem brighter, so maybe your warm white has a somewhat higher CRI than the cool white, and that's another reason (besides glare) why you don't perceive a big difference.

Also, side-by-side in the same fixture is a very poor way to compare lights. You need to look at what the light source illuminates, _not_ the light source itself. When I made my comparison in the store it was in a row of single tubes, and the cool white was markedly brighter in terms of what it lit up. No placebo effect here. And I've had an almost instinctive revulsion to incandescent light from the time my eyes opened. I remember as a young child asking my mom "Why does a light bulb look like cocky?" I wasn't referring to the shape, either.

[ QUOTE ]
Huh? Why? Evolutionary sense? That's a bit of a conundrum if you ask me, where is it proven that evolution follows a logical guideline in progression? Hehe, humans seem to be the antithesis of such a hypothesis; if you consider our present state of primative, uncivilized behavior. Why would evolution favor more energetic wavelengths over less energetic wavelengths?

[/ QUOTE ]
Because more energetic photons interact more strongly, and hence are easier to detect (i.e. you need fewer of them). And since UV comprises a sizable part of the solar spectrum, if we aren't able to "see" it, or at least the longer, less harmful wavelengths, then we lose a big evolutionary advantage to a species that can.

BTW, humans _are_ a big contradiction in the so-called evolutionary theory. We may be the only species that "evolves" enough to render our very environment uninhabitable. Another working theory of mine is that this won't be the first time, either. I suspect the inhabitants of the mythical Atlantis far exceeded even present-day technology, but their disregard for the effects of their technology on the planet caused the last Ice Age, which resulted in their surviving descendants being thrown back into the Stone Age. It's just a theory of mine, with as yet no supporting evidence. If we ever unearth computer chips that are half a million years old, we may need to rewrite the history books.

For what it's worth I did find one apparent contradiction in the original article I linked to:

[ QUOTE ]

As suggested above, if the light source is inadequate, the performance of the human visual system is degraded in the purple area. That is why artists and museums like natural skylight (not sunlight). When the system is degraded by inadequate illumination, a color temperature less than 7053 Kelvin, the performance of the human system begins to degrade in the purple area. Further degradation causes a loss in the perception of blues. 


[/ QUOTE ]
This sounds all well and good until you realize that glass attenuates the very wavelengths of UV which the author claims we need to perceive purples properly. Therefore, it may simply be the higher color temperature of skylight, rather than the presence of UV, that makes the artwork in question look better. Of course, the author may still be correct about needing UV to perceive purples properly, but his theory needs to be tested outdoors. I'll say this much: I definitely notice how much more vivid purple and lilac flowers appear outdoors than indoors, and by indoors I mean in a room lit solely by sunlight during the daytime. Since window glass blocks most UV, the lack of UV is presumably what accounts for the difference in perception. Whether this is due to an ability on my part to see shorter wavelengths reflected from the flowers, or to fluorescence effects, is an open question.


----------



## jamesraykenney (Sep 12, 2004)

[ QUOTE ]
*jtr1962 said:*
See this page. The lasers used in LASIK surgery mostly seem to have a wavelength of 193 nm. This is well below the minimum 300 nm response in that article. Yes, UV LEDs and blacklights do have some frequency bleed, but not enough to account for how bright they appear to some people. I've heard that a dim purple UV LED can appear as a blindingly bright bluish-white to someone with extreme UV sensitivity. For me personally it isn't that extreme, but they appear somewhat brighter than they "should" according to the mcd specs (which are usually in the 100 to 300 range). I'd say I see a 300 mcd UV LED as closer to 1000 mcd or so. 

[/ QUOTE ]

Same with me....
I have the UV Photon (the new one) and the REAL UV Photon III.
The Photon III seems to put out a VERY white (but dim) beam to me (not purple AT ALL).

BTW, I read about this at least 20 years ago... The same article (this was before the internet) said it was rumored that certain 'organizations' recruited people who had had their lenses replaced so that they could see with 'invisible' lighting.


----------



## The_LED_Museum (Sep 12, 2004)

The 370-375nm UV Photons should generate a purplish white color - not bright at all though.
I guess this depends on the individual using it though.


----------

