Posted: Feb 26, 2016 11:17 pm
by Calilasseia
My understanding, courtesy of the relevant scientific research, is that colour perception depends upon a multiplicity of factors. First, it depends upon the wavelength of the light reflected from objects, which enters the eye and triggers the requisite photoreceptor molecules. Second, it depends upon which photoreceptor molecules are present to be triggered.

However, even if we work with what might be termed a "standard" set of photoreceptors, colour perception is rather more involved than simple triggering of those photoreceptors by relevant wavelengths. This is illustrated by both the behaviour of colour photographic film, and the behaviour of early television cameras.

If you take a photograph of a scene in natural daylight, using a particular sample of film manufactured to reproduce colours faithfully under natural daylight, and then take a photograph of another scene, this time illuminated by a tungsten filament bulb, using that same film, you will notice a difference between the photographs after the film has been developed and the positive prints produced. The scene photographed under tungsten illumination will feature an orange cast over areas of white. Likewise, using a film manufactured to reproduce colours faithfully under tungsten illumination will result in natural daylight scenes having a blue cast over areas of white.

The same phenomenon affected early television cameras. Set up the camera to reproduce colours faithfully under one set of lighting conditions, and the moment that camera is moved to different lighting conditions, it will cease to reproduce colours with the same fidelity. The colours displayed on the television screen connected to that camera will change constantly as the lighting conditions change when the camera is moved from point to point.

This is Part 1 of the requisite Horizon documentary covering this:

[video embed]

Parts 2, 3 & 4 are provided below:

[video embeds]
Basically, what influences colour vision is a combination of the spectral distribution of ambient light, and how much of that spectral distribution is reflected to our eyes by objects in our field of view. The relative proportions of red, green and blue light reflected remain constant across a wide range of spectral distributions, and these are integrated by the brain to produce colour constancy within those varying spectral distributions. Even spectral distributions that have some gaps in them, such as those generated by older fluorescent lighting, don't appear to fool the brain because of this integrative process, but they do impact heavily upon film - special fluorescent filters are needed to prevent weird colour casts appearing on daylight balanced film in an environment involving 1970s/1980s vintage fluorescent lights. Only under extreme spectral distributions does the colour integration system break down, such as monochromatic spectral distributions (which were useful in, for example, the Ole Seehausen experiments on sexual selection via colour in Lake Victoria Cichlid fishes).
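The point about relative proportions can be illustrated with a toy calculation (my own sketch, not anything from the research itself): if the colour signal in each channel is simply surface reflectance multiplied by the illuminant's strength in that channel, then comparing each channel against a white reference cancels the illuminant, leaving the reflectance - which is the quantity that stays constant when the lighting changes.

```python
# Toy model (an assumption for illustration, not Land's actual mathematics):
# per-channel colour signal = surface reflectance x illuminant strength.
# Dividing by the signal from a perfect white reference (reflectance = 1)
# cancels the illuminant term, recovering the reflectance.

def reflected(reflectance, illuminant):
    """Per-channel colour signal: product of reflectance and illuminant."""
    return tuple(r * i for r, i in zip(reflectance, illuminant))

surface = (0.8, 0.4, 0.2)    # an orange-ish surface (R, G, B reflectance)
white = (1.0, 1.0, 1.0)      # a perfect white reference patch

daylight = (1.0, 1.0, 1.0)   # flat "daylight" spectrum (toy values)
tungsten = (1.3, 1.0, 0.6)   # red-heavy "tungsten" spectrum (toy values)

for illum in (daylight, tungsten):
    signal = reflected(surface, illum)
    ref = reflected(white, illum)
    # Normalising by the white reference cancels the illuminant.
    recovered = tuple(s / w for s, w in zip(signal, ref))
    print(recovered)  # the same reflectance under both illuminants
```

The raw signal from the surface differs wildly between the two illuminants, yet the ratio against the white reference is identical in both cases - a crude analogue of why perceived colour stays stable while the raw flux reaching the eye does not.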

The paper produced by Edwin Land (inventor of, amongst other things, the Polaroid camera), that presented the full details of the integrative process is this one:

Lightness And Retinex Theory by Edwin H. Land & John J. McCann, Journal of the Optical Society of America, 61(1): 1-11 (1971) [Full paper downloadable from here]

Land & McCann, 1971 wrote:Sensations of color show a strong correlation with reflectance, even though the amount of visible light reaching the eye depends on the product of reflectance and illumination. The visual system must achieve this remarkable result by a scheme that does not measure flux. Such a scheme is described as the basis of retinex theory. This theory assumes that there are three independent cone systems, each starting with a set of receptors peaking, respectively, in the long-, middle-, and short-wavelength regions of the visible spectrum. Each system forms a separate image of the world in terms of lightness that shows a strong correlation with reflectance within its particular band of wavelengths. These images are not mixed, but rather are compared to generate color sensations. The problem then becomes how the lightness of areas in these separate images can be independent of flux. This article describes the mathematics of a lightness scheme that generates lightness numbers, the biologic correlate of reflectance, independent of the flux from objects.


And the proof of the pudding from this paper is that it has been applied to the business of automatic white balancing in cameras, and it works. Digital cameras, still and video, routinely rely upon this kind of processing to perform automatic white balancing.
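To give a flavour of what an automatic white balance step does, here is a minimal sketch of the "grey-world" heuristic - a much simpler relative of full retinex processing, offered purely as illustration rather than as any actual camera's pipeline. It assumes the scene averages to grey, and scales each channel so that its mean matches the overall mean, which strips out a uniform colour cast.

```python
# Minimal grey-world white balance sketch (a simple heuristic related to, but
# far simpler than, full retinex; toy values, not a real camera pipeline).

def grey_world_balance(pixels):
    """Scale R, G, B so each channel's mean equals the overall grey mean."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    grey = sum(means) / 3.0
    gains = [grey / m for m in means]
    return [tuple(p[c] * gains[c] for c in range(3)) for p in pixels]

# A grey scene lit by a red-heavy "tungsten" illuminant: every pixel carries
# an orange cast (R > G > B), like the tungsten photographs described above.
scene = [(0.65, 0.5, 0.3), (1.3, 1.0, 0.6), (0.26, 0.2, 0.12)]
balanced = grey_world_balance(scene)
print(balanced)  # cast removed: each pixel now has R = G = B
```

Because every pixel in this toy scene is a grey surface multiplied by the same illuminant, the per-channel gains exactly undo the cast - the same job the orange-cast tungsten photograph needed film balancing (or a filter) to do.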

So, colour as we perceive it is actually an integrative phenomenon, relying upon a considerable amount of post-processing of the data by the brain. Although we can assign colours to wavelengths, on the basis that shining a light of a particular wavelength upon a white screen will produce the colour in question, our actual handling of colour in a complex environment is very much the product of our brains, and is aimed at ensuring colour constancy under a wide range of ambient spectral distributions - constancy that would be absent if we simply relied upon the raw signal data without the post-processing.

Which demonstrates that it pays to be rigorous when covering these topics. :)