Hyperspectral imaging is widely used for a broad range of applications in the industrial, life sciences, and space imaging sectors. The technology combines imaging and spectroscopy in a single instrument and is distinct from traditional RGB imaging, which provides three channels of visible color across an image. By contrast, hyperspectral imaging provides a complete spectrum with tens or even hundreds of bands, usually extending outside the visible range and in some cases beyond the NIR band (Main Image). The resulting spectral information can be used for image classification, real-time detection, and surveillance.

Even as technological advances have led to increased commercialization of hyperspectral instruments across industries, hyperspectral imaging does not yet find widespread use in land-based agricultural applications. Its primary application within agriculture and ecology remains large-scale spectral data gathering using flight-based instruments. Nonetheless, hyperspectral imaging is finding an expanding role in the plant sciences, ecology, and agriculture. Different plant tissues exhibit signatures in the visible and NIR that correlate with plant health, growth rates, and crop yields. Scientists can use hyperspectral imaging to gain insights into leaf chlorophyll content, nutrient status or deficiency, fruit ripeness, and the presence of stressors such as drought, disease, and pests.

Main Image: Three-channel RGB dataset (left) versus contiguous hyperspectral data. Courtesy of Living Optics.

One approach, snapshot live video hyperspectral imaging, brings the benefits of these crop monitoring techniques out of the laboratory and into the field. With this method, users can perform live analysis and interpretation in handheld use cases. This practice is expected to lead to advances in quantifying the effects of environmental factors and human activities on biodiversity.

Limitations of current spectral imaging systems

A bottleneck to the adoption of hyperspectral imaging for handheld use in agriculture and ecology is the limitations of camera systems on the market. Many imagers struggle to provide real-time images for live-capture assessments because of the technology's post-processing requirements. NASA developed the first spectral imaging devices for environmental monitoring in the 1970s and 1980s, and the airborne or space-based systems traditionally used for plant and vegetation monitoring have primarily been multispectral or line-scan hyperspectral systems imaging large areas. Multispectral systems typically image using a small number of wide-bandwidth channels, whereas hyperspectral cameras produce narrow, generally contiguous bands. Consequently, hyperspectral imaging is more frequently used for experimentation and application discovery, especially when the specific wavebands of interest are unknown or when a range of current and future applications is to be investigated with the same camera system.

Line-scan hyperspectral imaging systems provide high spectral resolution, using multiple exposures to gather a hyperspectral image. These systems place an imaging slit at the focal plane of an objective lens, with relay optics and a dispersive element in front of the sensor (Figure 2). The slit images a single line of the scene across one axis of the detector, while the dispersive element spreads that light across the other axis by wavelength.

Figure 2. Comparison of line-scan and snapshot hyperspectral data collection. Courtesy of Living Optics.

This approach requires the slit to be scanned across the scene, repeating the process line by line to build a complete hyperspectral image. For airborne or space-based systems, scanning is typically provided by the inherent motion of the platform; for ground-based systems, that motion can be replaced by a mirror or a scanning table. Although line-scan hyperspectral imaging delivers high spectral resolution, scanning across the scene and combining many frames limits the system's overall speed. These systems are also inflexible and generally better suited to static scenes with fixed lighting conditions and predictable camera motion.
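To make this line-by-line acquisition, and why it limits speed, more concrete, the short sketch below assembles a hyperspectral cube from successive slit exposures. It is a minimal NumPy illustration, assuming a hypothetical read_slit_frame() function in place of a real camera interface; the frame dimensions are arbitrary rather than those of any particular instrument.

```python
import numpy as np

# Hypothetical acquisition parameters (illustrative only, not tied to a real camera).
N_LINES = 480        # number of slit positions scanned across the scene
N_SAMPLES = 640      # spatial pixels along the slit
N_BANDS = 96         # spectral channels produced by the dispersive element


def read_slit_frame(line_index: int) -> np.ndarray:
    """Stand-in for one exposure: the slit images a single spatial line, and the
    disperser spreads its light across the sensor's other axis by wavelength,
    giving a (spatial samples x spectral bands) frame."""
    rng = np.random.default_rng(line_index)
    return rng.random((N_SAMPLES, N_BANDS))


# Build the full cube one exposure at a time: every line of the scene requires
# its own frame, which is why line-scan systems need platform motion or a scan
# mirror, and why acquiring a full cube is comparatively slow.
cube = np.empty((N_LINES, N_SAMPLES, N_BANDS))
for line in range(N_LINES):
    cube[line] = read_slit_frame(line)

print(cube.shape)  # (480, 640, 96) -> (lines, samples, bands) hyperspectral cube
```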
Video-rate snapshot hyperspectral imaging

Neither multispectral nor line-scan hyperspectral imaging offers the combination of performance and practicality required for widespread field use in agricultural and ecological applications. As an alternative to the line-scan approach, snapshot hyperspectral imaging captures all required data in a single exposure, enabling video-rate live assessments. There are multiple snapshot hyperspectral imaging methods, most of which map spectral bands (either single bands or multiplexed bands) to different positions where they are collected by one or more sensors. Snapshot systems have typically traded faster output rates for lower spectral and spatial resolution, but advances in snapshot technology have largely removed resolution as a limiting factor. These improvements make such systems beneficial for many applications, including those in agriculture and ecology. The instantaneous hyperspectral data provided by snapshot technology offers a unique opportunity for targeted in situ measurements and real-time assessments without the platform motion or added optical complexity of line-scan systems.

Plant and ecological monitoring with video-rate snapshot hyperspectral imaging

In a case study, agricultural scientists captured real-world data from various test samples to demonstrate the potential of snapshot hyperspectral imagers. Data was collected using Living Optics' video-rate snapshot hyperspectral camera. The device simultaneously produces two axially aligned outputs: a 2048 x 2432-pixel RGB image and 4384 evenly sampled hyperspectral data points. Two CMOS sensors produce the RGB and hyperspectral outputs at up to 30 frames per second. The snapshot hyperspectral image is generated using a coded aperture-based compressive technique, and the resulting hyperspectral data comprises 96 bands covering the visible and NIR spectrum (440 to 900 nm).

To monitor the health of grassland ecosystems, test samples were collected and imaged from three treatment plots: a control with no experimental treatment applied; a drought plot with 50% of precipitation intercepted by rain shelters; and an irrigation plot onto which the precipitation collected from the drought plot is sprayed through a sprinkler system.

From these plots, the scientists used the Normalized Difference Vegetation Index (NDVI), a metric used to quantify the greenness of vegetation and thereby its abundance and distribution. NDVI values are calculated from two bands, one in the red and one in the NIR:

NDVI = (NIR - red) / (NIR + red)

In satellite and airborne surface mapping, NDVI returns a score between -1 and 1: negative values (red band more intense than the NIR) correspond to water; values close to zero (NIR and red bands of similar intensity) correspond to barren areas with little vegetation; low positive values (NIR band more intense than the red band) represent grassland; and high positive values (NIR band much more intense than the red band) are associated with tropical rainforest.
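As an illustration of how NDVI can be derived from such data, the sketch below averages reflectance within a red and an NIR window for each spectral sample and applies the formula above. The array shapes mirror the camera output described here (4384 spectral samples across 96 bands spanning 440 to 900 nm), and the windows are those listed for the study's NDVI maps (see Figure 3), but the code is a generic NumPy example rather than the manufacturer's SDK.

```python
import numpy as np

# Spectral sampling matching the camera description: 96 bands over 440-900 nm.
wavelengths = np.linspace(440, 900, 96)    # band centers in nm
spectra = np.random.rand(4384, 96)         # placeholder reflectance, one row per sampled point


def band_mean(spectra: np.ndarray, lo_nm: float, hi_nm: float) -> np.ndarray:
    """Average reflectance over all bands whose centers fall within [lo_nm, hi_nm]."""
    mask = (wavelengths >= lo_nm) & (wavelengths <= hi_nm)
    return spectra[:, mask].mean(axis=1)


# Red and NIR windows used for the NDVI maps in the study (Figure 3).
red = band_mean(spectra, 660, 680)
nir = band_mean(spectra, 790, 810)

# NDVI = (NIR - red) / (NIR + red), one value per spectral sample.
ndvi = (nir - red) / (nir + red + 1e-9)    # small epsilon guards against divide-by-zero

print(ndvi.min(), ndvi.max())              # values fall in [-1, 1]
```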
The NDVI maps for the three study plots are shown in Figure 3.

Figure 3. NDVI maps of the (left) control plot, (middle) drought treatment plot, and (right) irrigation treatment plot, calculated from their resolution-enhanced data cubes. The NDVI wavebands used were 660-680 nm (red) and 790-810 nm (NIR). Courtesy of Living Optics.

Chlorophyll levels were also calculated from lettuce leaf samples using the Living Optics hyperspectral imaging (HSI) camera and compared with a commercially available handheld chlorophyll meter, known as a SPAD meter. Data was taken from regions of interest across different leaves and different parts of the plant to ensure that a variation in chlorophyll levels was captured. The SPAD meter calculates a chlorophyll metric from the transmission of light at two wavelengths, 650 nm (red) and 940 nm (NIR). Ten SPAD measurements were taken within each leaf's region of interest and compared to a proposed chlorophyll metric based on the hyperspectral data (Figure 4). The shaded regions in plot a indicate the wavebands used to calculate the chlorophyll metric from the Living Optics camera data, and plot b shows good agreement between the proposed metric and the SPAD readings over the range measured.

Figure 4. (left) Reflectance spectra taken with the Living Optics camera; (right) the computed chlorophyll metric compared to SPAD values. Courtesy of Living Optics.
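The article does not specify the wavebands or formula behind the proposed hyperspectral chlorophyll metric, so the sketch below uses a generic red-edge chlorophyll index purely as an illustrative stand-in. The window_mean and chlorophyll_index helpers, the chosen wavelength windows, and the placeholder region-of-interest spectra are all hypothetical; in practice the windows would be chosen to match the shaded wavebands in Figure 4.

```python
import numpy as np

# Same spectral sampling as before: 96 bands spanning 440-900 nm.
wavelengths = np.linspace(440, 900, 96)


def window_mean(spectrum: np.ndarray, lo_nm: float, hi_nm: float) -> float:
    """Mean reflectance of a single spectrum over bands within [lo_nm, hi_nm]."""
    mask = (wavelengths >= lo_nm) & (wavelengths <= hi_nm)
    return float(spectrum[mask].mean())


def chlorophyll_index(spectrum: np.ndarray) -> float:
    """Illustrative red-edge chlorophyll index: NIR reflectance divided by
    red-edge reflectance, minus one. Higher values suggest more chlorophyll.
    The windows below are stand-ins, not the wavebands used in the study."""
    red_edge = window_mean(spectrum, 700, 720)
    nir = window_mean(spectrum, 780, 800)
    return nir / (red_edge + 1e-9) - 1.0


# Example: average the spectra inside a leaf region of interest and score it,
# mirroring how per-region SPAD readings were compared to the hyperspectral metric.
roi_spectra = np.random.rand(50, 96)    # placeholder spectra for one leaf ROI
print(chlorophyll_index(roi_spectra.mean(axis=0)))
```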
Expanding applications

The study results indicate that snapshot video-rate hyperspectral cameras have the potential to take spectral imaging beyond the laboratory and into a wide range of environments for real-time, live analysis. Further technological improvements are now opening up new applications. Advances in automated tooling, for example, allow users to convert radiance data into reflectance estimates of plant surfaces, and expanding this capability could help enable widespread adoption. These tooling advances are being evaluated under a range of viewing geometries and lighting conditions to establish their efficacy for multiple use cases.

At the same time, work is underway elsewhere to investigate biodiversity and plant species migration by estimating the vegetation composition of peatland environments using automatic classification of hyperspectral data. In another case, in-field analysis within vineyards and orchards focuses on quantifying the potential gains in yield estimates and fruit quality from tailored picking based on fruit sugar distribution, as well as providing a platform for data-driven automated picking systems.

With these initiatives underway, widespread market adoption of snapshot hyperspectral imaging in industries such as agriculture and ecology will require spectral data, or the insights derived from it, to be available and understandable to non-experts. Users must be presented with quantified nutrient-deficiency and plant-health maps rather than having to interpret spectral index images themselves. This requires faster data analysis in the field and less user interaction with the raw data, both of which are made possible by automated data processing that translates the technical outputs of hyperspectral imaging into actionable information.

Meet the authors: Steve Chappell is CTO and co-founder of Living Optics. A technologist with experience in startup, scale-up, and enterprise technology delivery, Chappell is a chartered engineer with a foundation in scientific instruments, precision electromechanical equipment, embedded electronics, and computing; email: steve@livingoptics.com.

Daniel Pearce heads the applications team at Living Optics. He is a physicist with 20 years of experience in spectroscopy and laboratory-based paint and coatings characterization. Pearce holds an honors degree in physics and computer science from the University of Warwick; email: dan@livingoptics.com.