
Choosing a Microscope Camera: Factors to Consider

LAUREN ALVARENGA, EVIDENT SCIENTIFIC

As remote sharing becomes popular in the microscopy world, it is more important than ever to be able to translate the images that the eye sees into something that can be reliably visualized on a screen. Choosing the right digital microscope camera requires a basic understanding of imaging sensor types, how their performances differ, and the external factors that affect the images acquired.

Comparing sensor types

One of the first factors to consider when choosing a digital microscope camera is the sensor type. Charge-coupled device (CCD) sensors comprise a dense matrix of photodiodes that operate by converting light energy (photons) into an electronic charge. The interaction between a photon and a silicon atom in the device is stored in a potential well and then transferred across the chip through a horizontal register. The content of the horizontal shift register is read out before the next row is loaded. This produces an analog raster scan of the photogenerated charge from the two-dimensional array of photodiode sensor elements. The scan is used to generate an image.

An image of the crystallized essential micronutrients used in plant fertilizers, captured using a microscope camera. Iron and potassium are essential for the synthesis of chlorophyll and for maintenance of the chloroplast complex, where photosynthesis occurs. Though it is important to reproduce microscope images accurately, such images can still be beautiful. The proper camera can provide publication-quality images, whether for scientific journals or imaging contests. An Olympus Image of the Year Award 2020 submission. Courtesy of Karl Gaff.

CCD cameras were long the standard of the microscopy world. As scientific needs evolved, however, the delay in charge transfer and the consequent slower frame rates became a barrier to imaging intracellular activities, which can require frame rates of more than 100 fps.

To achieve this speed, researchers turned to complementary metal oxide semiconductor (CMOS) sensors. (CMOS refers to a manufacturing process and not to a specific imaging technology.) CCDs far outperformed early CMOS sensors, but by the early 2000s, advancements in CMOS design had yielded chips with smaller pixels, reduced noise, and larger imaging arrays, making CMOS sensors more comparable to CCDs.

CMOS and CCD chips detect light through similar mechanisms that exploit the photoelectric effect. When a broad wavelength band of visible light contacts a crystalline silicon semiconductor, the silicon releases a variable number of electrons in proportion to the intensity of the light reaching the surface. Generally, the more light reaching a sensor, the more electrons the sensor will produce. The electrons are then stored until the acquisition period is finished. This is true for both CCD and CMOS sensors. A major difference comes when reading out the charge from each photodiode, or pixel: CCDs transfer the charge horizontally to a register, whereas CMOS sensors convert each pixel’s charge, individually and immediately, into a voltage.

A major advantage of modern CMOS image sensors over CCDs is the ability to integrate processing and control functions. These features generally include timing logic, exposure control, analog-to-digital conversion, shuttering, white balance, gain adjustment, and initial image processing algorithms.

Green algae from the genus Xanthidium. The image was captured using a microscope camera and differential interference contrast (DIC) microscopy. A combination of water immersion, 138 images stacked in three layers, and Zerene Stacker software achieves a sense of depth of field. The DIC slider is pulled out halfway to achieve the orange-red color effect of fringes around the edges. An Olympus Image of the Year 2020 Award submission. Courtesy of Håkan Kvarnström.

The most popular CMOS designs are built so that both the photodiode and readout amplifier are incorporated into each pixel. This means that the charge accumulated by the photodiode is converted to an amplified voltage inside the pixel and then transferred in sequential rows and columns to the analog signal-processing portion of the chip, where it is processed to form a digital electronic representation of the scene imaged by the sensor.


Factors behind camera choices

The photodiode, often referred to as a pixel, is the key element of a digital image sensor. Microscopes are used to observe details so fine that they approach the resolution limit of the optics themselves, so the final image resolution is set by the optical system as much as by the sensor. This means that smaller pixels, or more pixels, do not always improve resolution.

The problem arises if the structure being imaged is poorly matched to the structure of the chip onto which the image is projected. To determine the appropriate pixel pitch (the center-to-center spacing of the pixels), the relevant factors are the numerical aperture, the total magnification of the optical system, and the spatial frequency of the sample.

A sensor’s Nyquist frequency, which is half the sampling frequency set by its pixel pitch (1/(2 × pixel pitch)), should match the optical cutoff frequency of the imaging system. The following formula can be used to calculate the optical cutoff frequency at the sensor plane, where NA represents the numerical aperture, λ the wavelength of light, and M the total magnification of the optical system:

optical cutoff frequency = 2 × NA / (λ × M)

If the Nyquist frequency is lower than the optical cutoff frequency, a smaller pixel size reduces the pixel pitch, raises the Nyquist frequency, and can increase resolution. Otherwise, smaller pixels cannot increase resolution, because the optical system’s point spread function already spreads the light from the sample into a spot much larger than the pixel pitch.
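As a rough worked example, the two frequencies can be compared in a few lines of Python; the objective, wavelength, and pixel-pitch values below are illustrative assumptions, not figures from the text:

```python
# Compare a sensor's Nyquist frequency with the optical cutoff frequency
# at the sensor plane. All values are illustrative assumptions.

wavelength_um = 0.55         # green light, in micrometers
numerical_aperture = 1.4     # hypothetical 100x oil-immersion objective
total_magnification = 100.0
pixel_pitch_um = 6.5         # hypothetical sCMOS pixel pitch

# Incoherent optical cutoff frequency referred to the sensor plane,
# in cycles per micrometer: 2 * NA / (wavelength * magnification)
cutoff_freq = 2 * numerical_aperture / (wavelength_um * total_magnification)

# Sensor Nyquist frequency: half the sampling frequency set by the pitch
nyquist_freq = 1 / (2 * pixel_pitch_um)

print(f"Optical cutoff at the sensor: {cutoff_freq:.4f} cycles/um")
print(f"Sensor Nyquist frequency:     {nyquist_freq:.4f} cycles/um")

if nyquist_freq >= cutoff_freq:
    print("Sampling is adequate; smaller pixels will not add resolution.")
else:
    print("Undersampled; a finer pixel pitch could recover more detail.")
```

Equivalently, the criterion is met when the pixel pitch is no larger than λ × M / (4 × NA), which works out to roughly 9.8 µm for the values above.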

Pixel size, however, is only a part of the sensitivity equation. The other common metric for measuring camera sensitivity is quantum efficiency, the fraction of incident photons that are converted into electrons. Although the quantum efficiency of modern cameras is very high, it is not the sole factor behind increased sensitivity. Because a pixel’s light-collecting area scales with the square of its pitch, increasing the pixel size even slightly can significantly improve sensitivity.
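A simplified signal model makes the pixel-size effect concrete. The sketch below is only an illustration of the scaling, with hypothetical parameter values, not a formula from any camera maker:

```python
# Simplified model: electrons collected per pixel scale with quantum
# efficiency, photon flux density, pixel area, and exposure time.

def collected_electrons(qe, photons_per_um2_per_s, pixel_pitch_um, exposure_s):
    pixel_area_um2 = pixel_pitch_um ** 2           # square pixel assumed
    return qe * photons_per_um2_per_s * pixel_area_um2 * exposure_s

base = collected_electrons(0.80, 50.0, pixel_pitch_um=4.6, exposure_s=0.1)
larger = collected_electrons(0.80, 50.0, pixel_pitch_um=6.5, exposure_s=0.1)

# Because the collecting area grows with the square of the pitch, a roughly
# 40 percent larger pixel gathers about twice the signal at the same QE.
print(f"Relative signal gain: {larger / base:.2f}x")
```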

A high signal-to-noise ratio is also crucial for reliable data collection. In some cases, with very bright samples, this is easy to achieve. In practice, the signal-to-noise ratio has both physical and technical limitations. The physical limitation stems from the statistical error, or shot noise, in the number of photoelectrons generated in the sensor chip, which is determined by the sample brightness and camera sensitivity.

The technical limitation arises from dark current and readout noise, including electrical noise on the electronic circuit. Sensor cooling suppresses hot pixels on CMOS and scientific CMOS sensors, essentially eliminating dark current as an issue for most cameras with less than a 2-s exposure time.
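These contributions are usually combined in the standard camera signal-to-noise model sketched below; the quantum efficiency, dark current, and read noise values are hypothetical and do not describe any specific sensor:

```python
import math

def snr(photons, qe=0.80, dark_e_per_s=0.5, exposure_s=0.1, read_noise_e=1.6):
    """Signal-to-noise ratio combining shot noise, dark current, and read noise."""
    signal = qe * photons                              # photoelectrons generated
    noise = math.sqrt(signal + dark_e_per_s * exposure_s + read_noise_e ** 2)
    return signal / noise

for photons in (50, 500, 5000):
    print(f"{photons:5d} photons/pixel -> SNR {snr(photons):6.1f}")
```

At high photon counts the shot-noise term dominates, which is why bright samples make a high signal-to-noise ratio easy to achieve, while dim samples expose the camera’s dark current and read noise.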

Frame rate is another important factor to consider when choosing a microscope camera. For live viewing, a high frame rate provides a smooth transition as the field of view changes. During acquisition, a sufficient frame rate enables researchers to capture dynamic activities.

Most cameras are suitable for single captures, but applications such as pathology consultation and case conferences require frame rates greater than 30 fps for smooth live imaging that keeps up with rapid microscope operation.

Optimizing image quality

For some applications, image processing can be used to enhance images so that the visual information provided by the optical system better suits the needs of the imager. However, it is important to understand the implications of image processing. The intended goal of microscopy imaging is to produce an accurate representation of a sample rather than a desirable image.

A digital image straight off the camera sensor is referred to as a raw image. Although raw images are suitable for some applications, optical artifacts such as dust or uneven illumination can degrade image quality. One way to fix these distortions is to acquire a background image during the same imaging session: after setting the acquisition parameters, acquire an image without any sample in the field of view. The background image can then be used to correct all subsequent images and remove the distortions.
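One common way to apply such a background image is a flat-field-style division. The sketch below assumes monochrome TIFF files, the NumPy and tifffile libraries, and placeholder file names:

```python
import numpy as np
from tifffile import imread, imwrite

# Placeholders: a raw sample image and a background image acquired with the
# same settings but with no sample in the field of view.
raw = imread("sample_raw.tif").astype(np.float64)
background = imread("background.tif").astype(np.float64)

# Divide out uneven illumination and dust shadows, then rescale so the
# corrected image keeps roughly its original intensity range.
eps = 1e-6                                  # guard against division by zero
corrected = raw / (background + eps) * background.mean()

imwrite("sample_corrected.tif", corrected.astype(np.float32))
```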

With clean, even backgrounds, an image can be aesthetically adjusted to optimize brightness and contrast. Adjustments should be conservative, however, and made on a copy of the raw image file so that any changes can be reversed if needed.
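A minimal sketch of such a conservative, reversible adjustment, applied to a copy so the raw data stays untouched; the gain and offset are arbitrary examples, and 16-bit data is assumed:

```python
import numpy as np

def adjust_brightness_contrast(image, gain=1.1, offset=100.0):
    """Return a linearly adjusted copy of a 16-bit image; the input is not modified."""
    adjusted = image.astype(np.float64) * gain + offset    # simple linear stretch
    return np.clip(adjusted, 0, 65535).astype(np.uint16)

# The untouched raw array remains available for quantification or for
# redoing the adjustment with different settings.
```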

It is inappropriate to perform most aesthetic modifications to fluorescence images if the images’ intensity values need to be quantified, because some common processing techniques will alter the measurement results.

Choosing a microscope camera

In the end, the most important part of choosing a microscope camera is its ability to accurately reproduce what the eye sees. Camera technology should be evaluated thoroughly against the most important requirements of an application, including the specifications needed for its illumination methods. Lastly, a microscope camera expert can answer questions and offer guidance through the final selection process.

Published: June 2022