Spectral Cognisance 2 – How White are your Whites?

In the first post of this series I started at the beginning of this story. We saw that the optical path and the sensor itself modulate the effective spectrum that is available to generate signal in an image sensor. Filtering, in the broadest sense, can occur in optical materials and in the absorption properties of the sensor. There is still more to discuss on this topic, and I will address it in a later post.

Today, I want to focus on the contribution of the source itself – the emission spectrum. So far I had left it at a short comment: “3500K black body”. What does this actually mean, and what does it tell us?

Compare the following data, showing three possible illumination sources:

Emission Spectra

Figure 1: Black Body, LED and XENON Emission Spectra

Data Sources (in order):

  1. Teledyne DALSA simulation
  2. http://www.luxeonstar.com/v/vspfiles/downloadables/DS64.pdf
  3. http://zeiss-campus.magnet.fsu.edu/print/lightsources/xenonarc-print.html

Humans look at all of these spectra and call the light “white”.

Sure, a light bulb at 3500K has a somewhat more yellowish tone than a fluorescent tube at 5000K, and a PC monitor set to 6500K would definitely exhibit some bluish hues. But I would only really notice that in a direct comparison.

On the other hand, an image sensor is much more sensitive and “sees” these differences more clearly. Remember the QE curve shown in the last post:

Figure 2: Photon Emission and Silicon Absorption Spectra.

Silicon-based, front-side illuminated CMOS image sensors are most sensitive around 600nm with photon absorption dropping to one-half of the peak at ~400 and ~800nm. Beyond 800nm, silicon becomes less sensitive but signal can be generated up to ~1100nm.

Imagine for a moment you use a light power meter set to a 600nm center wavelength. As explained earlier, this setting only applies a calibrated gain; no photon is actually filtered out.
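To make the role of that center-wavelength setting concrete, here is a minimal sketch of the underlying arithmetic: a power meter set to 600nm effectively assumes every incoming photon carries the energy of a 600nm photon. The function name and values below are illustrative, not taken from any real meter.

```python
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_rate(power_w: float, center_wavelength_nm: float) -> float:
    """Photon rate implied by a power reading, assuming every photon
    carries the energy of the meter's center wavelength."""
    photon_energy_j = H * C / (center_wavelength_nm * 1e-9)
    return power_w / photon_energy_j

# 1 mW interpreted at the 600 nm setting:
rate_600 = photon_rate(1e-3, 600.0)
# The same 1 mW of blue light (450 nm) actually contains fewer photons,
# because each blue photon carries more energy:
rate_450 = photon_rate(1e-3, 450.0)
assert rate_450 < rate_600
```

If the actual spectrum is centered somewhere other than 600nm, the implied photon count is therefore systematically wrong – which is exactly the problem the bullet points below walk through.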

  • Silicon QE is roughly centered on 600nm and might result in a decent measurement.
  • Black Body emission at 3500K emits ~10x more photons above 600nm than below 600nm (in the valid absorption range of silicon), so the 600nm setting is clearly off-center.
  • Black Body radiation at 6500K shows the opposite effect, with emission at 400nm 2-3x higher than at 800nm. This may result in a more balanced spectral response from the sensor when set at 600nm.
  • White LEDs have a high peak in the blue, producing a disproportionate number of blue photons. Our power meter would apply an average gain that is set too far into the red.
  • XENON emission is nice and flat in the visible range but shows strong peaks in the red. This likely results in a scenario similar to a Black Body between 3500K and 6500K.
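The black-body bullet points above can be checked numerically with Planck’s law expressed in photons rather than energy. The sketch below integrates the photon radiance over two bands split at 600nm; the band limits (400–1100nm) are an assumed approximation of silicon’s usable absorption range.

```python
import math

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def photon_radiance(wl_m: float, temp_k: float) -> float:
    """Spectral photon radiance of a black body (photons/s/m^2/sr/m)."""
    return (2.0 * C / wl_m**4) / (math.exp(H * C / (wl_m * KB * temp_k)) - 1.0)

def band_photons(temp_k: float, lo_nm: float, hi_nm: float, steps: int = 2000) -> float:
    """Crude midpoint-rule integral of photon radiance over a wavelength band."""
    total = 0.0
    dl = (hi_nm - lo_nm) / steps * 1e-9
    for i in range(steps):
        wl = (lo_nm + (i + 0.5) * (hi_nm - lo_nm) / steps) * 1e-9
        total += photon_radiance(wl, temp_k) * dl
    return total

for temp in (3500.0, 6500.0):
    below = band_photons(temp, 400.0, 600.0)
    above = band_photons(temp, 600.0, 1100.0)
    print(f"{temp:.0f} K: photons above/below 600 nm = {above / below:.1f}")
```

Running this shows the 3500K source delivering nearly an order of magnitude more photons above 600nm than below, while the 6500K source is much closer to balanced – consistent with the bullets above.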

Staying with the power-meter example, we can conclude that comparable power measurements across these light sources would be very difficult.

But why does this matter for an image sensor?

First, an image sensor doesn’t care! Much like the power meter, it will collect photons as they arrive (depending on its pixel size and fill factor), convert them to electrons (depending on the QE curve), store the electrons, convert them to a voltage (based on the charge conversion gain) and finally to a digital number (based on the ADC conversion gain).

In this chain it only matters how many electrons eventually show up for conversion. The sensor doesn’t care (or know) which wavelength the corresponding photon had.
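The chain above can be sketched as a single toy function. All parameter values below are illustrative placeholders, not numbers from any real sensor datasheet:

```python
def photons_to_dn(n_photons: float,
                  qe: float = 0.5,            # quantum efficiency at this wavelength
                  fill_factor: float = 0.6,   # fraction of pixel area that is sensitive
                  conv_gain_uv_per_e: float = 30.0,  # charge conversion gain, uV/e-
                  adc_uv_per_dn: float = 60.0        # ADC step size, uV/DN
                  ) -> int:
    """Photons -> electrons -> voltage -> digital number."""
    electrons = n_photons * fill_factor * qe
    voltage_uv = electrons * conv_gain_uv_per_e
    return int(voltage_uv / adc_uv_per_dn)

# Any mix of wavelengths that yields the same electron count produces the
# same digital number -- the sensor does not know the photons' colour.
print(photons_to_dn(1000))  # 300 electrons -> 150 DN
```

Note that wavelength enters only through the `qe` argument; once the electrons exist, the rest of the chain is colour-blind.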

OK then, so why bother?

Three reasons:

  1. Some sensors DO know the (approximate) colour of the photon – these are colour sensors (e.g. with an RGB Bayer pattern overlaid on the pixel matrix)
  2. Applications often care what photon energy they use to detect specific effects (e.g. red/infrared in food inspection or blue lasers in your Blu-Ray player). Here the emission and absorption spectra are of utmost importance for controlling the wavelength and sensitivity of the system.
  3. There are image sensor artefacts that respond differently to different wavelengths. In a future post we shall discuss how wavelength can affect MTF (spatial resolution) and Shutter Efficiency (temporal resolution) of a sensor.

I will discuss the impact of spectral content on image sensor colour reproduction in my next post “When Black turns Red”.

Until Then,

Matthias


About Matthias

Born in the early seventies in Northern Germany I graduated in Physics at the University of Ulm. I worked for the BOSCH automotive group in Stuttgart before joining DALSA in 2000. Since then I have worked on layout, design, project lead, architecture and program management for various DALSA CMOS image sensors. I enjoy creating sensors and seeing them through production and implementation at the customer end.
Posted by Matthias in CMOS, Image Sensors, Machine Vision.
