The Angle on Optical Acceptance

In my last post (not the April Fools post, but the one before) I described how pixel Fill Factor plays a role in the optical efficiency of an image sensor.

The Optical Acceptance Angle is directly related to Fill Factor, but it is rarely discussed or specified. Misjudging this effect can degrade image flatness and colour quality in an optical system.

Consider the following optical system:

A lens is sketched, focussing onto a simplified pixel array (micro lenses only). The traced rays illustrate the center pixel (green), which sees moderate incidence angles, and the outermost pixel (red), which sees extreme (exaggerated here) angles.

This wide distribution of incoming angles is essentially the issue a larger image sensor faces in an optical system.

 

If we look at a pixel in more detail and reduce the incoming angle distribution to its extremes, we get the following pictures.

With increasing incidence angle, more light rays fall onto metallization or onto neighbouring pixels. As a result, the pixel output signal decreases as the incidence angle grows, and signal is “shared” between neighbouring pixels, degrading MTF and creating optical (colour) crosstalk.
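If you want to get a feel for this effect, below is a deliberately crude 1-D toy model (nothing like the full ray-tracing simulations mentioned next): the focal spot is assumed to walk sideways by stack height times the tangent of the incidence angle, and whatever leaves the photodiode aperture is counted as lost to metal or, eventually, to the neighbouring pixel. The stack height, aperture width and pixel pitch are made-up, illustrative numbers, not values of any real sensor.

```python
import math

def angular_response(theta_deg, stack_height_um=3.0, aperture_um=2.0, pitch_um=3.5):
    """Crude 1-D geometric toy model of pixel response vs. incidence angle.

    The focal spot is modelled as a strip of width `aperture_um` that walks
    sideways by stack_height * tan(theta) before reaching the photodiode plane.
    All parameter values are illustrative assumptions, not real sensor data.
    Returns (fraction captured by the pixel, fraction spilled onto the neighbour).
    """
    shift = stack_height_um * math.tan(math.radians(theta_deg))
    # Part of the spot still over its own photodiode aperture:
    own = max(0.0, min(1.0, 1.0 - shift / aperture_um))
    # Light leaving the aperture first hits metallization; only once the shift
    # exceeds the metal gap does it start landing on the neighbouring pixel:
    spilled = max(0.0, shift - (pitch_um - aperture_um)) / aperture_um
    neighbour = min(1.0 - own, spilled)
    return own, neighbour

for theta in (0, 10, 20, 30):
    own, neigh = angular_response(theta)
    print(f"{theta:2d} deg: captured {own:.2f}, crosstalk {neigh:.2f}")
```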

To improve or resolve these non-idealities, pixel designers employ ray-tracing simulations.

The illustration below shows an actual sensor pixel with a high-end lens at infinity focus and an f1.6 aperture. The light rays are shown as parallel bundles arriving at the extreme angles the lens will produce (upper and lower Coma rays). The resulting “focal cones” need to be considered when designing the pixel.
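As a rough paraxial sanity check (my own simplification, not something read off the ray-trace figures), the half-angle of such a focal cone for an infinity-focused lens can be estimated directly from the f-number:

```python
import math

def cone_half_angle_deg(f_number):
    """Paraxial estimate of the focal-cone half-angle for an infinity-focused
    lens: tan(theta) = 1 / (2 * N). Real lenses, and the upper/lower Coma
    rays in the figures, will deviate from this."""
    return math.degrees(math.atan(1.0 / (2.0 * f_number)))

for n in (1.6, 1.2):
    print(f"f/{n}: ~{cone_half_angle_deg(n):.1f} deg half-angle")
```

For f1.6 this gives roughly 17 degrees of half-angle, for f1.2 roughly 23 degrees, which the pixel has to accept on top of any Chief Ray Angle (more on that below).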

Center Pixel, system lens at f1.6. Optimized micro lens position. Center & Coma rays shown

Center Pixel, system lens at f1.2. Optimized micro lens position. Center & Coma rays shown. Pixel acceptance exceeded.

The “effective aperture” of a pixel (for a given lens) is reached when further widening of the lens aperture (that is, a wider spread of the Coma rays) yields a diminishing rate of signal increase at the sensor output. Each full f-stop of additional aperture should double the output signal; once it no longer does, the pixel’s acceptance is being exceeded.
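A back-of-envelope way to locate that effective aperture from a series of measurements: between any two aperture settings the signal should scale with the aperture area, i.e. with the square of the f-number ratio, and the first step that clearly falls short of this marks the limit. The sketch below uses a purely hypothetical measurement series for illustration; the function name and tolerance are my own.

```python
def accepted_aperture(f_numbers, signals, tolerance=0.9):
    """Estimate where a pixel's acceptance is exceeded.

    f_numbers: lens f-numbers in decreasing order (widening aperture)
    signals:   corresponding mean pixel output (arbitrary units)
    Between two settings the signal should scale with the aperture area,
    i.e. by (N_prev / N_next) ** 2.  The first step where the measured
    gain falls below `tolerance` times that ideal gain marks the point
    where opening the lens further no longer helps the pixel.
    Returns the last f-number that still behaved ideally.
    """
    for (n_prev, s_prev), (n_next, s_next) in zip(
            zip(f_numbers, signals), zip(f_numbers[1:], signals[1:])):
        ideal_gain = (n_prev / n_next) ** 2
        if s_next / s_prev < tolerance * ideal_gain:
            return n_prev
    return f_numbers[-1]

# Hypothetical measurement series, for illustration only:
f_stops = [2.8, 2.0, 1.4, 1.2]
outputs = [100, 196, 380, 400]   # aperture scaling breaks down beyond f/1.4
print(accepted_aperture(f_stops, outputs))  # -> 1.4
```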

Designers can optimize a pixel such that it is maximally responsive to this incoming focal cone for a lens with known Coma ray behaviour. Pixel size, metallization density and stack height do set physical limits, however.

Taking the physical extent of the image sensor into account (remember the “red” rays in the first figure), the above Coma ray range acquires an “offset” in the form of a non-zero Chief Ray Angle (“CRA”). For simplicity we can think of the CRA as the center ray of the focal cone.
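In this simplified picture the CRA at a given point on the sensor follows directly from the image height and the distance to the lens’s exit pupil. The sketch below uses illustrative, assumed numbers; real lenses (near-telecentric designs in particular, plus distortion and pupil aberrations) will deviate.

```python
import math

def chief_ray_angle_deg(image_height_mm, exit_pupil_distance_mm):
    """Simplified CRA: angle of the ray from the centre of the exit pupil to a
    point `image_height_mm` off-axis on the sensor. Ignores distortion and
    pupil aberrations; both parameters below are illustrative assumptions."""
    return math.degrees(math.atan(image_height_mm / exit_pupil_distance_mm))

# Hypothetical lens with a 50 mm exit-pupil distance, pixel 10 mm off-axis:
print(f"CRA ~{chief_ray_angle_deg(10, 50):.1f} deg")   # ~11.3 deg
```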

The lens in this design shows narrowing Coma extremes away from the optical axis, but the default pixel still shows a reduction in accepted aperture (left image). A typical way around this new problem is called “shifting” (shown on the right). Shifting is applied to both the colour filters and the micro lenses, and it is specified as a displacement relative to the pixel’s optical axis that varies with position in the array. The figures further below show the edge pixel illuminated as above, but with a micro lens shift of roughly 1/10 of the pixel pitch. (Note that the optimized lens position for the center pixel was already shifted.)
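To get a feel for how large such a shift needs to be, a first-order estimate is simply the lateral walk-off of the chief ray over the optical stack, i.e. stack height times tan(CRA). Stack height, pitch and CRA in the sketch below are assumed, illustrative values; the real shift profile comes out of the ray-trace optimization. With these particular numbers the result happens to land near the ~1/10 pitch used here, but that is a coincidence of the chosen parameters.

```python
import math

def microlens_shift_um(cra_deg, stack_height_um=3.0):
    """First-order micro-lens (and colour filter) shift: the lateral walk-off
    of the chief ray over the optical stack. Stack height is an assumed,
    illustrative value."""
    return stack_height_um * math.tan(math.radians(cra_deg))

pitch_um = 3.5                             # assumed pixel pitch, for scale only
shift = microlens_shift_um(cra_deg=6.0)    # assumed CRA at this edge pixel
print(f"shift ~{shift:.2f} um, i.e. {shift / pitch_um:.2f} of the pixel pitch")
```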

Edge Pixel, system lens at f1.6. Center-Pixel micro lens position. Coma rays at non-zero Chief Ray Angle.

Edge Pixel, system lens at f1.6. Optimized micro lens position. Coma rays at non-zero Chief Ray Angle.

While these results are quite good (accepted aperture ~f1.2), the analysis was done for one lens series! One can go a bit further and show that whole lens families (e.g. the high-end Digital Cinema lenses of a single manufacturer) can be reasonably well served with this method. Switching to a different manufacturer or a different lens family, however, can completely upset the design and bring back colour distortions and loss of image flatness.

And this is why micro lens shifting is not usually practised in a variety-rich environment like Machine Vision: a system can be optimized for a certain application, but not for all of them at once.

Gotta love variety,

Matthias

About Matthias

Born in the early seventies in Northern Germany, I graduated in Physics at the University of Ulm. I worked for the BOSCH automotive group in Stuttgart before joining DALSA in 2000. Since then I have worked on layout, design, project leadership, architecture and program management for various DALSA CMOS image sensors. I enjoy creating sensors and seeing them through production and implementation at the customer end.