Having come from the machine vision world, I found "Lux" and human vision a mystery for a long time. Thankfully, we have had some customers with human vision applications, and I have been able to remedy this situation. Tell me what you think…
Following the Spectral Cognisance Series, consider the following graph:
The black curve shows the now-familiar 3500 K black-body emission spectrum (a halogen light source); the blue, a silicon sensor's Quantum Efficiency curve. The red depicts a photopic filter curve applied to "Emission Spectrum × Sensor QE". The photopic response curve is the basis for light power measurements in "Lux" and takes into account the physiological sensitivity curve of the human eye, or "human QE", if you like.
We can think of the blue curve as the “machine response” to the incoming spectrum and the red curve as the “human response”. It is obvious the two are not the same.
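The gap between the two responses is easy to reproduce numerically. Below is a minimal sketch: the black-body term uses Planck's law, but the silicon QE and photopic curves are my own Gaussian stand-ins, not the measured data behind the graph.

```python
import math

# Hypothetical smooth approximations, NOT the measured curves in the
# graph -- just enough shape to reproduce the argument.
WAVELENGTHS = range(400, 1001, 10)  # nm, 10 nm steps

def blackbody_3500k(nm):
    """Relative 3500 K black-body emission via Planck's law (arbitrary units)."""
    m = nm * 1e-9
    h, c, k, T = 6.626e-34, 2.998e8, 1.381e-23, 3500.0
    return (1.0 / m**5) / (math.exp(h * c / (m * k * T)) - 1.0)

def silicon_qe(nm):
    """Rough sketch of silicon QE: broad response peaking around 650 nm."""
    return math.exp(-((nm - 650) / 250.0) ** 2)

def photopic(nm):
    """Gaussian stand-in for the CIE photopic curve (peak at 555 nm)."""
    return math.exp(-((nm - 555) / 45.0) ** 2)

# "Machine response": emission weighted by the sensor QE.
machine = sum(blackbody_3500k(w) * silicon_qe(w) for w in WAVELENGTHS)
# "Human response": the same emission weighted by the photopic curve.
human = sum(blackbody_3500k(w) * photopic(w) for w in WAVELENGTHS)

# The warm halogen spectrum plus silicon's NIR sensitivity means the
# machine collects far more signal than the photopic weighting admits.
print(machine > human)  # True
```

Even with crude curves, the ratio comes out heavily in the machine's favour, because the halogen source pushes most of its energy into the red and near-infrared where the eye has stopped responding.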
This ties in nicely with the previous observation in “When Red turns Black”, where we found that human colour perception requires a silicon sensor be strongly IR filtered to achieve a reasonable (unprocessed) colour reproduction.
Once we try to take the next step and make a sensor behave like “real human vision” we have an entirely new problem: Lux.
The unit Lux is nicely defined in Wikipedia:
In other words: "light power per area", or "W/cm2", as we spectrally insensitive scientists say… with one key difference: Lux allows itself the luxury (pun intended) of defining exactly what light it likes: the same as the human eye. The photopic spectrum!
This spectrum peaks at ~555 nm and falls off quickly towards ~450 nm in the blue and ~650 nm in the red.
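The radiometric-to-photometric conversion behind the lux is a weighted integral: illuminance is the spectral irradiance multiplied by the photopic curve V(λ) and scaled by 683 lm/W. A short sketch, again using a Gaussian stand-in for V(λ) rather than the tabulated CIE values:

```python
import math

def v_lambda(nm):
    # Gaussian sketch of the CIE photopic luminosity function V(lambda):
    # 1.0 at the 555 nm peak, near zero by ~450 nm and ~650 nm.
    return math.exp(-((nm - 555) / 45.0) ** 2)

def irradiance_to_lux(spectrum, step_nm=1.0):
    """spectrum: {wavelength_nm: spectral irradiance in W/m^2/nm}.
    E_v = 683 lm/W * integral of V(lambda) * E_e(lambda) d(lambda)."""
    return 683.0 * step_nm * sum(v_lambda(nm) * e for nm, e in spectrum.items())

# One W/m^2 of monochromatic 555 nm light is 683 lux by definition;
# the same watt at 850 nm (NIR) registers essentially zero lux, even
# though a silicon sensor responds to it happily.
print(irradiance_to_lux({555: 1.0}))  # 683.0
print(irradiance_to_lux({850: 1.0}))  # ~0
```

This is the whole point: a watt is not a watt once you ask a lux-meter. Outside the photopic band, radiant power simply does not count.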
From an image sensor perspective, humans basically see "green"! To prove that, see below the comparison between a typical "Green" image sensor colour filter (IR cut already included) and the photopic spectrum:
So engineers use RGB filters, scientists measure single wavelengths in µW/cm2… but how will the poor photographer or cinematographer deliver colour to humans?
I arrived at the conclusion that specific optical filters are the best solution.
Below is an example of the HOYA CM500S and BG40 spectra, two filters commonly used for human vision imaging applications.
Again you can see that the filtering is still not 100% accurate, but if we consider that the blue emissions of the light source are weak to begin with, the HOYA CM500S looks like a good fit. The BG40 follows more closely the eye's spectral shift in a darker scene (the human eye's sensitivity curve changes in the dark) but is still a reasonable representation.
One challenge I have is that customers often ask for specifications in "lux·s" (signal per lux-second of exposure).
I have taken to just providing our measured results from the sensors response to a lux-spectrum calibrated lighting system, hoping that other sensor suppliers are doing the same in their performance claims.
Why do I say that? Well, picture this.
If I use measurement equipment that supplies a photopic spectrum and measures the light power with a photopically calibrated lux-meter, I get an appropriate response from both the meter and the sensor.
Supplier B uses a white light source (and you remember that "white" can mean anything) and a lux-meter. I would wager that the lux-meter has a filter that transmits only the spectrum the human eye would see. The light source, however, carries a tremendous amount of red energy that creates signal in the sensor but not in the lux-meter. Hence Supplier B measures tremendous responsiveness for his sensor (high sensor signal at low lux-power), while I get only lux-signal for lux-power.
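A toy model makes the discrepancy concrete. The spectra below are hypothetical Gaussian sketches (not vendor data): source A mimics my lux-spectrum calibrated setup, source B the red-heavy "white" lamp.

```python
import math

# Toy model of the two test setups, sampled 400-1000 nm in 10 nm steps.
WAVELENGTHS = range(400, 1001, 10)

def photopic(nm):          # lux-meter filter / human-eye weighting
    return math.exp(-((nm - 555) / 45.0) ** 2)

def sensor_qe(nm):         # broad silicon response reaching into the NIR
    return math.exp(-((nm - 650) / 250.0) ** 2)

def source_a(nm):          # my setup: lux-spectrum calibrated source
    return photopic(nm)

def source_b(nm):          # Supplier B: "white" source with a big red tail
    return math.exp(-((nm - 800) / 300.0) ** 2)

def signal_per_lux(source):
    lux_reading = sum(source(w) * photopic(w) for w in WAVELENGTHS)
    sensor_signal = sum(source(w) * sensor_qe(w) for w in WAVELENGTHS)
    return sensor_signal / lux_reading

# Same sensor, same lux-meter -- but the red-heavy source inflates the
# "signal per lux" figure, because the sensor sees energy the meter ignores.
print(signal_per_lux(source_b) > signal_per_lux(source_a))  # True
```

The sensor never changed; only the lamp did. That is why a "lux-referenced" sensitivity claim is meaningless without knowing the test spectrum.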
After reading this blog series, you may forgive me when I don't typically offer an explanation of why we prefer to express responsivity using the µW/cm2, narrow-band lighting method.
Now, if I am not mistaken, this concludes my series on image sensor spectral behaviour. Let’s see what I write about next.
Thank you for following me through on this series!