Image Sensor Design Innovation Shines When the Lights Go Down

Sensor design must keep pace as applications soar for low-light imaging.

By Lauren LeCours and Joe Kuczynski

Speed is of the utmost importance for many imaging and machine vision tasks. This metric is often used in the context of image capture rates and acquisition times. For a range of applications, including industrial inspection and production-line quality assurance, high-speed imaging is paramount to ensure that captured images yield as much usable information as possible for end users. In this way, ultrafast imaging is an essential tool for informed decision-making.

A close-up of a single-photon avalanche diode (SPAD) array with a microlens. Courtesy of Pi Imaging.


End users may be able to optimize image acquisition speeds for a particular application. With modern image-processing software and AI tools, real-time image analysis has become viable across a wide range of industrial settings.

Unfortunately, a host of other factors can affect imaging system performance — including many that are more difficult to optimize or influence.

Lighting is one such factor: Poor lighting can render an obtained image unusable and even compromise components and equipment in imaging systems. While industries such as machine vision, entertainment, and horticulture already rely on industry-standard or customized lighting and illumination, solutions providers serving other applications are working to meet the challenge of inconsistent and variable lighting.

The IMX735 CMOS image sensor from Sony Semiconductor Solutions is designed for automotive cameras. It offers 17.42 effective megapixels for sensing and recognition performance, contributing to safer automated driving systems. Courtesy of Sony Semiconductor.


Much of the progress is taking place at the sensor level. For applications in transportation, science, and research, efforts are underway to improve the sensor systems that are needed to enable fast, highly precise imaging in challenging and low-light conditions.

The power of CMOS

Though their spectral range can often be extended into the NIR, commercial silicon CMOS image sensors show a spectral response from ~400 to ~700 nm, similar to that of the human eye. Adding a filter to a CMOS sensor can further tailor its performance; an IR cut-off filter, for example, may be incorporated. Monochrome and color CMOS sensors likewise offer different capabilities that are preferred for certain applications.

CMOS sensors typically operate in two forms: rolling shutter and global shutter. Rolling shutter sensors find use in passively illuminated settings, whereas global shutter solutions are more commonly used in active illumination applications. By sampling all pixels simultaneously and reading out the entire image frame, global shutter sensors capture images of fast-moving objects free of distortion, unlike conventional rolling shutters, where pixels are read out line by line. When coupled with a lighting system, they may also enable users to significantly reduce power consumption.
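
Conceptually, the difference comes down to when each row is sampled. The short Python sketch below is an illustrative toy model only (the frame size, bar speed, and scene are arbitrary assumptions, not vendor behavior); it simulates a bright bar sweeping across the frame and shows how row-by-row readout shears a moving object while a simultaneous global readout does not.

```python
# Illustrative sketch (not vendor code): why rolling-shutter readout skews
# fast-moving objects while a global shutter does not. Assumes a simple
# scene model of a bright vertical bar sweeping across a 2D frame.
import numpy as np

HEIGHT, WIDTH = 8, 32          # toy sensor resolution
BAR_SPEED = 2                  # pixels the bar moves per row-readout interval

def scene(t):
    """Return the scene at time t: a 3-pixel-wide bright bar at column ~t."""
    frame = np.zeros((HEIGHT, WIDTH), dtype=int)
    left = int(t) % (WIDTH - 3)
    frame[:, left:left + 3] = 1
    return frame

def global_shutter(t0):
    """All rows sampled at the same instant t0: no motion skew."""
    return scene(t0)

def rolling_shutter(t0):
    """Each row is sampled one readout interval later than the previous one,
    so a moving object appears sheared across the frame."""
    return np.vstack([scene(t0 + r * BAR_SPEED)[r] for r in range(HEIGHT)])

if __name__ == "__main__":
    for name, img in [("global", global_shutter(0)), ("rolling", rolling_shutter(0))]:
        print(name)
        for row in img:
            print("".join("#" if v else "." for v in row))
```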

Switzerland-based STMicroelectronics has sold more than 1.5 billion global shutter CMOS sensors. The company’s devices are engineered to excel in the NIR and designed to operate in dim environments.

In the company’s ST BrightSense sensor, each pixel within its matrix is covered by a microlens and equipped with a light pipe. These structures concentrate and optimize photon capture by directing light through the module lens down to the photodiode, increasing sensor light sensitivity and maximizing the amount of light received. When sensing light at a wavelength of 940 nm in the NIR, smart control of the photon trajectory is key to obtaining the best sensitivity while preserving sharpness. Keeping pixel crosstalk low is a challenge, especially at this wavelength: Crosstalk between pixels degrades the modulation transfer function, which can hinder the sharpness that global shutter CMOS sensors can deliver.

The company’s ST BrightSense global shutter CMOS sensors use advanced backside-illuminated pixel technology with full pixel isolation to ensure high image sharpness and capture fine details. Backside-illuminated sensors increase quantum efficiency, allowing light to reach the charge-conversion layer more directly. Additionally, the high sensitivity of these sensors boosts low-light performance and enables fast image capture and response times. These improvements are demonstrated in applications such as obstacle avoidance in mobile robots and user recognition in personal electronics. The products are also commonly used in barcode scanners.

According to Nicolas Roux, product line manager for imaging camera sensors at STMicroelectronics, using a global shutter CMOS image sensor requires that information be stored within the pixel. This necessitates the use of memory nodes. It can be challenging to ensure that the memory nodes do not influence sensitivity, which is critical for effective low-light imaging.

ST embeds these memories on the same layer as the photodiode. This enables a two-layer approach, with the top layer for pixels only and the bottom layer dedicated to circuitry for analog-to-digital conversion, image-signal processing, output interface, or even local computer vision. This architecture allows different technological process tuning, with the top layer sharply optimized for light sensing and the bottom layer set for low-power and high-density circuitry.

A high-dynamic-range (HDR) scene comparison between Forza Silicon’s ISIE19 CMOS sensor (top) and the previous-generation ISIE11 CMOS sensor (bottom). Courtesy of Forza Silicon.

And by using vertical storage for the memory nodes, Roux said, users can ensure their global shutter CMOS sensors are effective not only in low-light conditions but also in situations where the light is regularly changing.

“The goal is to get a global shutter that does not have a lot of noise,” Roux said. “Typically, for global shutters you may have some of what we call a ‘dark noise.’ So, you have to create a global shutter that is very good in dark noise — which is not easy.”

The circuitry embedded in the bottom layer of the sensor — both the analog-to-digital conversion and the image processing — is also valuable for reducing noise and improving the signal-to-noise ratio.

The intriguing rise of CMOS sensors

The story of the emergence of CMOS sensors, and of the different iterations that would ultimately characterize their evolution, is rooted in cost-efficiency. In the late 1970s and 1980s, CMOS sensors became a more cost-effective option than CCD sensors. Despite offering inferior readout noise values and image quality compared with CCD sensors, their favorable price point contributed significantly to their use.

Later, CMOS’ scalability benefits led to further increased adoption. This factor itself has evolved, but it remains important today. CMOS manufacturing techniques hold the key to upscaling in the microelectronics and optoelectronics sectors.

Even as CMOS sensors have become commonplace, companies including Oxford Instruments Andor continue to develop and market CCD cameras because the strengths of these sensors remain highly relevant for certain applications.

“Oxford Instruments Andor still sells these deep-cooled CCD cameras,” said Alan Mullan from Oxford Instruments. “Customers need these cameras with very low dark current that are very comfortable working at extended exposure times for applications such as luminescence. And then the downsides of the CCD camera and the readout speed aren’t important, since you are only taking an acquisition every 20 minutes. You are accumulating the signal over that length of time, and the most important thing for detection is, ‘How low is this dark current?’”

An image captured by an HDR scientific CMOS (sCMOS) camera. Courtesy of Excelitas (top). A backside-illuminated sCMOS sensor used in Oxford Instruments’ Sona camera for fluorescence microscopy applications. Courtesy of Oxford Instruments (bottom).

As a basic building block of image quality, pixel size is a core consideration. In addition to image resolution, pixel size has a direct bearing on imaging sensitivity, given the trade-off between the two. A smaller pixel size boosts image resolution at the cost of light.

The result, in low-light environments, is images with larger amounts of noise.

Given the prevalence of CMOS image sensors, particularly in low-light imaging, the fact that CMOS sensors can be manufactured with pixel sizes significantly smaller than those achievable with CCD sensors creates an interesting dynamic. The larger pixel sizes associated with CCD sensors offer lower readout noise and low dark current. Essentially, larger pixels on the sensor offer the sensitivity needed for effective low-light imaging.
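
A back-of-the-envelope calculation makes the trade-off concrete. The sketch below uses assumed, illustrative numbers (the photon flux, quantum efficiency, and read noise are not tied to any particular sensor) to show that the collected signal scales with pixel area, so the shot-noise-limited SNR roughly doubles each time the pixel pitch doubles.

```python
# Back-of-the-envelope sketch (illustrative numbers, not vendor specs):
# photon collection scales with pixel area, and the shot-noise-limited
# signal-to-noise ratio scales with the square root of the collected signal.
import math

PHOTON_FLUX = 50.0      # assumed photons per square micron per exposure (low light)
READ_NOISE_E = 2.0      # assumed read noise in electrons rms
QE = 0.7                # assumed quantum efficiency

def snr(pixel_pitch_um: float) -> float:
    """SNR of a single pixel: signal / sqrt(shot noise^2 + read noise^2)."""
    signal_e = PHOTON_FLUX * pixel_pitch_um ** 2 * QE   # collected electrons
    noise_e = math.sqrt(signal_e + READ_NOISE_E ** 2)   # shot + read noise
    return signal_e / noise_e

for pitch in (1.0, 2.0, 4.0, 6.5):   # typical mobile to scientific pixel pitches
    print(f"{pitch:.1f} um pixel -> SNR ~ {snr(pitch):.1f}")
```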

At the same time, users cannot discount the efficacy of CMOS sensors. Imaging cameras may experience high levels of illumination in daylight, but they will need to compensate for low- or no-light conditions at night. Here, CMOS sensors that simultaneously capture color and infrared in a single frame offer the necessary flexibility.

“CMOS sensors have broadened access to low-light imaging compared to costlier CCDs,” said Uldric Antao, business development manager at Forza Silicon. Antao points to single-photon avalanche diodes (SPADs). These components enable photon counting with binary detection, making them highly effective in low light but less suitable for bright conditions, he said.

Automotive applications

The automotive industry is one sector that must meet the challenge of inconsistent lighting head-on. Automotive lighting considerations extend from the interior of the vehicle to the exterior, since images from both must be captured to ensure drivers are provided with as much information as possible about their surroundings. Nighttime driving, in which operators pass through areas of varied illumination, is a particular challenge.

A goal in automotive imaging is to capture better images in low-light situations without tapping into systems that will rapidly drain batteries or consume excessive power. CMOS sensors are adept at capturing images in low light and at rapidly adjusting when a scene suddenly dims.

For Sony Semiconductor, a longtime developer of sensors for high-quality image acquisition, including in automotive applications, parallels exist between the current surge in automotive imaging and the boom in smartphone imaging that kicked off in the mid-to-late 2000s. For smartphones, the company’s CMOS sensors balance sensitivity, dynamic range, resolution, and power consumption.

According to Yoko Yasukouchi, senior manager of public relations at Sony Semiconductor, the next big market boom will come from automotive. Outside of industry applications or special surveillance situations, many imaging systems are trying to meet the same standards as the human eye. This means being able to see both the stars in a dark sky and images in bright sunlight, which is crucial for autonomous driving.

“Especially in automotive, image sensors are expected to have this wide dynamic range ready at all times to allow cars to drive into a dark parking garage, the same way [they operate in] a tunnel, and see inside and outside,” said Jens Landgraf, deputy head of Sony Europe Design Centre for Sony Semiconductor. Such changes in illumination increase the need for high dynamic range in these systems. The likelihood that levels of illumination will differ on all sides of the vehicle — with contributions from headlights, brake lights, and areas without lighting — further underscores the need for more flexible sensors.

CMOS sensors can improve the image captured inside of a car as well as outside of it. Inside the car, the system is often paired with IR illumination and can detect signs of drowsiness or a medical emergency that could lead to a catastrophic crash, offering major value. Sensor-based systems that remind a driver to take their eyes off the phone and pay closer attention to the road could also find immediate use.

Two-dimensional structured illumination microscopy images of a mouse brain slice with a wide field of view. The spatial resolution is ~130 nm. Courtesy of Excelitas.
Two-dimensional structured illumination microscopy images of a mouse brain slice with a wide field of view. The spatial resolution is ~130 nm. Courtesy of Excelitas.


Two-dimensional structured illumination microscopy images of a mouse brain slice with a wide field of view. The spatial resolution is ~130 nm. Courtesy of Excelitas.

As it relates to CMOS sensors for automotive applications, systems designers favor global shutter varieties. “In industrial and IR-based applications, global shutters allow synchronization with the IR light source, which lets you pulse the IR light precisely during image capture, reducing system power consumption while maximizing image quality,” Landgraf said. “But the downside of global shutter is that, especially for automotive, you have a high temperature operation range. So in automotive, all our sensors have to be qualified for operation from −40 °C to 125 °C.”

For exterior sensing, rolling shutter sensors have proved to be highly effective because they perform reliably across temperature extremes. This is a major advantage in automotive environments, and enables efficient, low-noise image capture. Rolling shutters also make multi-capture high-dynamic-range methods more cost-effective, combining exposures directly on-chip with minimal memory requirements.
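
On-chip automotive implementations vary and are proprietary, but the basic idea of multi-capture HDR can be sketched simply: take a long and a short exposure of the same scene, and wherever the long exposure saturates, substitute the short exposure scaled by the exposure ratio. The values below (full-well level, exposure ratio, toy scene) are assumptions for illustration only.

```python
# Minimal sketch of multi-exposure HDR combination (the general idea only;
# on-chip automotive implementations differ). A long exposure is used where
# it is not saturated, and a short exposure, scaled by the exposure ratio,
# fills in the highlights.
import numpy as np

FULL_WELL = 4095          # assumed 12-bit pixel saturation level
EXPOSURE_RATIO = 16       # long exposure time / short exposure time

def merge_hdr(long_exp: np.ndarray, short_exp: np.ndarray) -> np.ndarray:
    """Combine two raw exposures of the same scene into one HDR frame."""
    saturated = long_exp >= FULL_WELL
    hdr = long_exp.astype(np.float64)
    # Where the long exposure clipped, substitute the rescaled short exposure.
    hdr[saturated] = short_exp[saturated].astype(np.float64) * EXPOSURE_RATIO
    return hdr

# Toy scene: a dark region (value 100) next to a bright light source
# (value 60000 in "true" linear units). The long exposure clips on the light.
true_scene = np.array([[100.0, 60000.0]])
long_exp = np.clip(true_scene, 0, FULL_WELL)
short_exp = np.clip(true_scene / EXPOSURE_RATIO, 0, FULL_WELL)
print(merge_hdr(long_exp, short_exp))   # -> [[  100. 60000.]]
```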

Today’s automotive sensors use dedicated libraries designed for these temperature ranges to ensure proven safety and effectiveness. Additionally, because of the inherent safety requirements of the automotive industry, sensors must demonstrate reliable long-term performance without the need for maintenance.

Scientific CMOS

sCMOS (scientific CMOS) sensors burst onto the scene in the late 2000s as the result of an industry collaboration between Fairchild Imaging, Andor Technology, and PCO Imaging. Using standard CMOS processes for fabrication, sCMOS features an innovative pixel architecture that achieves a distinct combination of low noise and high speed. This architecture enables sCMOS sensors to achieve significantly lower readout noise than both CCDs (~3 to 4 electrons) and conventional CMOS sensors (~10 electrons), while still delivering the high frame rates necessary for scientific imaging and vision applications.

sCMOS is purpose-built for research and night vision and is commonly used in light microscopy; its development has enabled current performance levels in microscopy techniques such as localization microscopy and other superresolution methods. Localization microscopy determines the positions of individual molecules from their blurred, diffraction-limited images, rather than relying on intensity alone, and thereby achieves resolution beyond the optical diffraction limit. Structured illumination microscopy, another technique served by sCMOS cameras, improves resolution by roughly a factor of two. sCMOS sensors have also enabled advancements in the use of fluorescence microscopy for the long-term observation of living organisms.
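
The localization principle can be illustrated with a toy calculation. Real localization microscopy fits a point-spread-function model to each emitter, but even a simple center-of-mass estimate on a simulated, shot-noise-limited spot (the spot position, width, and photon count below are arbitrary assumptions) recovers the emitter position to a small fraction of a pixel, far below the width of the blurred spot itself.

```python
# Illustrative sketch of the localization idea (simplified; real localization
# microscopy fits a PSF model to each spot): the centroid of a blurred,
# diffraction-limited spot can be estimated far more precisely than the
# width of the spot itself.
import numpy as np

rng = np.random.default_rng(0)
SIZE, SIGMA_PX = 15, 2.0              # image patch size and PSF width in pixels
true_x, true_y = 7.3, 6.8             # assumed sub-pixel "molecule" position

yy, xx = np.mgrid[0:SIZE, 0:SIZE]
psf = np.exp(-((xx - true_x) ** 2 + (yy - true_y) ** 2) / (2 * SIGMA_PX ** 2))
image = rng.poisson(psf * 500)        # shot-noise-limited photon counts

# Center-of-mass localization of the blurred spot.
total = image.sum()
est_x = (image * xx).sum() / total
est_y = (image * yy).sum() / total
print(f"true ({true_x}, {true_y}) vs estimated ({est_x:.2f}, {est_y:.2f})")
```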

However, the wide range of optical microscopy and imaging techniques benefiting from sCMOS sensors and cameras does not imply that these sensors are without limitations. “One problem of the sCMOS cameras has been that the dark current and the thermal noise of the camera are much higher than those of a CCD camera,” Mullan said.

Still, said Excelitas’ Gerhard Holst, a member of the development team for the first sCMOS sensor, the combination of low readout noise, high frame rate, and high resolution has emerged as a game-changing resource for a range of scientific imaging pursuits.

Challenges

Imaging trends often follow a similar formula: provide smaller pixels without increasing the optical format, packing more of them into the same light-collecting area. Some of today’s techniques for low-light imaging combine pixels through so-called binning, or pixel coupling, which reduces resolution but still produces a typically acceptable low-light image.
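
As a rough illustration of digital 2×2 binning (many sensors instead bin in the charge or analog domain, which carries different noise trade-offs), the sketch below sums each 2×2 block of pixels, quartering the pixel count while pooling the signal.

```python
# Minimal sketch of 2x2 pixel binning (illustrative; sensors may bin in the
# charge or analog domain rather than digitally as done here): four neighboring
# pixels are summed into one, trading resolution for signal.
import numpy as np

def bin_2x2(frame: np.ndarray) -> np.ndarray:
    """Sum each 2x2 block of pixels; output has half the resolution per axis."""
    h, w = frame.shape
    return frame[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

frame = np.arange(16).reshape(4, 4)
print(bin_2x2(frame))        # 2x2 output, each value the sum of a 2x2 block
```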

Though CMOS sensors remain a dominant technology, some areas of advancement are stalling. Notably, according to Landgraf, CMOS imaging technology is reaching its limits in terms of pixel size and quantum efficiency.

This underscores that, fundamentally, quality imaging in low-light and NIR conditions can be challenging. Often, improvements in poor lighting come at the cost of resolution or color fidelity. In low-light situations, CMOS sensors require higher quantum efficiency and larger photodiodes to capture more light. CCD sensors, with larger pixels, are limited by low readout speeds and are also energy-hungry devices.

Reducing noise can be a struggle as well. With CMOS sensors, temporal filtering can accomplish this by averaging frames, for example by capturing and comparing frames at 120 fps. However, faster readout can lead to higher power consumption, which may generate additional noise. As with many technologies, the issue is one of trade-offs.
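
The statistics behind temporal filtering are straightforward: averaging N frames of a static scene reduces uncorrelated temporal noise by roughly √N. The sketch below demonstrates this with a simulated dim scene and Gaussian temporal noise; the frame count, scene level, and noise level are arbitrary assumptions.

```python
# Sketch of temporal noise filtering by frame averaging (illustrative only):
# averaging N frames of a static scene reduces temporal noise by roughly
# a factor of sqrt(N), at the cost of motion blur for moving content.
import numpy as np

rng = np.random.default_rng(1)
N_FRAMES = 120                         # e.g., one second of capture at 120 fps
scene = np.full((64, 64), 20.0)        # dim, static scene (mean signal level)

frames = scene + rng.normal(0.0, 5.0, size=(N_FRAMES, 64, 64))  # temporal noise
single = frames[0]
averaged = frames.mean(axis=0)

print(f"single-frame noise:   {single.std():.2f}")
print(f"averaged-frame noise: {averaged.std():.2f}  (~{5.0 / np.sqrt(N_FRAMES):.2f} expected)")
```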

SPAD camera imaging is an intriguing solution, as these cameras gather a large amount of information from a single photon. However, these systems are still too large in terms of physical size and pixel size. They also struggle to adapt to situations with inconsistent lighting.

The future of CMOS

Advancements in CMOS sensors will continue to make low-light imaging systems more compact and affordable. A smaller sensor can also allow more room to integrate other technologies to further reduce noise and improve reaction times.

AI and increased on-chip functionality, driven by stacked architectures, will provide an additional boost to this technology. Stacking technology allows for a decoupled development process where the pixel die can be developed independently before integration with the logic die, which performs specific functions on the chip. It also enables flexibility for users to modify functions while still getting the same pixel performance from the top wafer.

Landgraf predicts that stacking could even help to leverage AI operations in CMOS sensors. “There are concepts of stacking memory to the image sensor that could allow for a little edge-AI weapon where you can then run neural networks on an image sensor,” he said.

“By using an attached memory and having a kind of processing unit on the image sensor to output just what the sensor sees rather than the full video, you just get information like ‘person,’ ‘object’ … depending on how you load neural networks on that chip.”

Published: December 2025