HDR Sensors for Embedded Vision
By Michael Tusch
Founder and CEO
At the late-March 2012 Embedded Vision Alliance Summit, Eric Gregori and Shehrzad Qureshi from BDTI presented a helpful overview of CCD and CMOS image sensor technology. I thought it might be interesting to extend this topic to cover so-called HDR (High Dynamic Range) / WDR (Wide Dynamic Range) sensors. HDR and WDR mean the same thing – it’s just a matter of which axis of your dynamic range graph you emphasize. I’ll use the term "HDR" throughout this article.
I think that this is an interesting topic because many embedded vision applications require equivalent functionality in all real-scene environments. We know that conventional cameras, even high-end DSLRs, aren’t able to capture as much information in very high contrast scenes as our eyes can discern. This fact explains why we have rules of photography such as “make sure the sun is behind you”. Indeed, conventional image sensors do have problems in such conditions, but the industry has devoted significant work over many years to HDR sensors which extend raw capture capability far beyond what is available in conventional consumer and industrial cameras. The reliability of the image capture component is of course one key element of the overall system performance.
The dynamic range (DR) of a sensor is the ratio of the brightest pixel intensity to the darkest pixel intensity that the camera can capture within a single frame. This number is often expressed in decibels (dB):
DR in dB = 20 * log10 (DR)
The human eye does very well and, depending on exactly how the quantity is measured, is typically quoted as being able to resolve around 120-130 dB in daytime conditions.
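To make the formula concrete, here is a minimal sketch of the conversion in both directions. The 12-bit sensor used as an example is an illustrative assumption, not a figure from this article.

```python
import math

def dr_to_db(ratio):
    """Convert a linear dynamic-range ratio to decibels: 20 * log10(ratio)."""
    return 20 * math.log10(ratio)

def db_to_ratio(db):
    """Inverse conversion: decibels back to a linear intensity ratio."""
    return 10 ** (db / 20)

# Assumed example: a 12-bit linear sensor spans at most a 4096:1 ratio.
print(round(dr_to_db(4096)))    # roughly 72 dB

# The ~120 dB quoted for the human eye corresponds to a 1,000,000:1 ratio.
print(round(db_to_ratio(120)))  # 1000000
```

The gap between roughly 72 dB for a conventional linear sensor and the 120+ dB of real high-contrast scenes is what motivates the HDR sensor designs discussed here.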
Image sensors are...