• 2nd March 2021

AD Technical Blog: Image Sensor Colour Representation

This is a graph of the spectral sensitivity of the cones in the human eye. An “S” cone cannot tell the difference between a bright 480 nm light source and a dim 450 nm one. The human brain uses the combined, weighted outputs of all three cone types to determine which colour is being perceived: a 480 nm light, for example, will stimulate the M and L cones in addition to the S cones.
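This ambiguity within a single cone can be illustrated with a toy model. The sketch below uses an invented Gaussian sensitivity curve (the peak and width are illustrative assumptions, not measured S-cone data) to show how a bright off-peak source and a dim near-peak source can produce the same cone output:

```python
import math

def cone_response(wavelength_nm, intensity, peak_nm=440.0, width_nm=30.0):
    """Toy Gaussian model of a single cone's spectral sensitivity.

    The Gaussian shape, peak and width are illustrative assumptions,
    not measured S-cone data.
    """
    sensitivity = math.exp(-((wavelength_nm - peak_nm) ** 2) / (2 * width_nm ** 2))
    return intensity * sensitivity

# A bright source away from the sensitivity peak...
bright_480 = cone_response(480.0, intensity=2.30)
# ...excites the cone almost exactly as much as a dim source near the peak,
# so a single cone alone cannot distinguish the two stimuli.
dim_450 = cone_response(450.0, intensity=1.0)
```

The brain resolves the ambiguity only by comparing this output against the other two cone types.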

Since our eyes perceive the relative excitation of our red-, green- and blue-sensitive cones as all of the other colours, we can trick them into perceiving colours that are not physically present. In CCTV videography this is usually done by varying the relative intensities of red, green and blue light.

The CIE photopic luminosity function is a widely used standard that allows engineers to accurately convert the absolute luminosity of a source at any particular wavelength into its relative perceived luminosity.
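For a monochromatic source this conversion is a single multiplication: luminous flux (in lumens) is 683 lm/W times the radiant power times V(λ). A minimal sketch, using a handful of published CIE 1931 V(λ) table values:

```python
# A few sampled values of the CIE 1931 photopic luminosity function V(lambda).
# V peaks at 1.0 around 555 nm and falls off towards both ends of the
# visible spectrum.
V_LAMBDA = {480: 0.139, 510: 0.503, 555: 1.000, 600: 0.631, 650: 0.107}

def luminous_flux_lm(wavelength_nm, radiant_flux_w):
    """Luminous flux of a monochromatic source: 683 lm/W * V(lambda) * radiant power."""
    return 683.0 * V_LAMBDA[wavelength_nm] * radiant_flux_w

# One watt at 555 nm (the peak of V) yields 683 lumens; the same watt of
# deep red at 650 nm is perceived as far dimmer.
```

A broadband source would instead integrate V(λ) against the source's spectral power distribution, but the principle is the same.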

Using the chart above, if we pick three wavelengths centred on the peak sensitivities of the three cone types and normalise their intensities against the photopic luminosity function, we get a mixture that approximates white light for average human vision. While this would look white to a human observer, it would miss out a huge amount of the visible spectrum.

While the light would look white, objects and environments do not play by the same three wavelengths. The absorption and emission spectra of materials can be exceedingly complex, and limiting the illumination to three narrow bands may make objects appear darker or brighter than their surroundings, since each object is only being perceived at those specific wavelengths. The ability of an illumination source to represent colours faithfully therefore depends on its broadband emission characteristics. This is referred to as the colour rendering index, or CRI, and it crops up often in LED lighting applications, as the phosphors used in LEDs often produce a very narrowband emission profile.

While CRI exists for lighting applications, no equivalent quantitative measure exists for sensing applications, as most sensors are likely to experience changes in illumination during normal operation.

Instead of CRI, sensing applications use white balance to compensate for lighting conditions. A white balance is a profile of relative gains applied to the red, green and blue channels. Image processors evaluate the relative strengths of the channels over the whole image and aim for a neutral result, where red and blue tones are balanced. However, in scenes with a large amount of red or blue light, the image processor may overcorrect, producing images with incorrect colour.
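The simplest version of this idea is "grey-world" white balance: assume the average colour of the scene ought to be neutral, and derive per-channel gains that make it so. A minimal sketch (real image processors use far more sophisticated statistics than a plain average):

```python
def grey_world_gains(pixels):
    """Grey-world white balance: derive (r, g, b) gains so that the
    average pixel becomes neutral. pixels is a list of (r, g, b) tuples."""
    n = len(pixels)
    avg_r = sum(p[0] for p in pixels) / n
    avg_g = sum(p[1] for p in pixels) / n
    avg_b = sum(p[2] for p in pixels) / n
    # Green is conventionally left at unity gain and used as the reference.
    return (avg_g / avg_r, 1.0, avg_g / avg_b)

def apply_gains(pixels, gains):
    """Apply per-channel gains, clipping to an 8-bit range."""
    return [tuple(min(255.0, c * g) for c, g in zip(p, gains)) for p in pixels]

# A scene with a red cast: the derived red gain is below 1, pulling the
# average back towards neutral.
scene = [(200.0, 100.0, 100.0), (100.0, 100.0, 100.0)]
gains = grey_world_gains(scene)
balanced = apply_gains(scene, gains)
```

The failure mode described above falls straight out of this: if the scene genuinely is dominated by red light, the grey-world assumption is violated and the algorithm will wrongly neutralise it.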

These challenging scenes can be compensated for in several ways. If the lighting and scene environment are known beforehand, a white balance profile can be created to correctly address the lighting conditions. Alternatively, the end user may opt to do their own post-processing to bring the image back to the expected colours.

Infra-red flood illumination, often used in CCTV to aid low-light performance, can complicate the situation further, as IR wavelengths will often present as red light in the image. Filtering out these wavelengths is easy, but defeats the purpose of the illumination in the first place. CCTV applications therefore often include a Day/Night switch that performs three actions to keep the quality of the image intact:

– Disables the infrared-cut filter. This filter is active during the day to remove IR that may enter the sensor from other sources, such as the sun.
– Enables the IR illuminators.
– Switches to a monochrome image.
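The three actions above always switch together, which makes the logic a single toggle. A sketch, using a hypothetical camera-state dictionary (the key names are illustrative, not a real product API):

```python
def set_night_mode(camera, night):
    """Apply the three coupled Day/Night switch actions to a camera state.

    The camera dict and its keys are hypothetical, for illustration only.
    """
    camera["ir_cut_filter"] = not night   # filter is in place during the day
    camera["ir_illuminators"] = night     # flood IR only at night
    camera["monochrome"] = night          # drop colour under IR lighting

cam = {}
set_night_mode(cam, night=True)   # night: filter out, illuminators on, mono
set_night_mode(cam, night=False)  # day: filter in, illuminators off, colour
```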

As the scene is under infrared illumination, recreating its colours in the visible range does not produce consistent results.

AD does not implement switchable infrared-cut filters: these filters are mechanically actuated, while we otherwise offer a completely solid-state system, and adding them would reduce the lifetime of our LRUs.

AD’s aerospace camera systems use a combination of illumination and sensors that maximises the linearity between red, green and blue pixels, allowing infrared to appear with minimal colour cast. This enables “always-on” IR illumination with minimal degradation to colour image performance.
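The reason matched channel response avoids a colour cast can be seen with a short numerical sketch: if the red, green and blue pixels respond equally to IR, the IR contribution is a common offset on all three channels, which reads as brightness rather than hue. Unequal response (typical sensors see IR mostly in the red channel) tints the image. All sensitivity figures below are invented for illustration:

```python
def pixel_under_ir(base_rgb, ir_flux, ir_sensitivity_rgb):
    """Add an IR contribution to each channel, scaled by that channel's
    (illustrative) IR sensitivity."""
    return tuple(c + ir_flux * s for c, s in zip(base_rgb, ir_sensitivity_rgb))

grey = (100.0, 100.0, 100.0)

# Matched IR response: the pixel brightens but stays neutral.
matched = pixel_under_ir(grey, ir_flux=50.0, ir_sensitivity_rgb=(0.4, 0.4, 0.4))

# Mismatched response, red-dominant: the same grey patch picks up a red cast.
mismatched = pixel_under_ir(grey, ir_flux=50.0, ir_sensitivity_rgb=(0.9, 0.3, 0.2))
```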


An example of relative pixel efficiencies for an image sensor.