
Sensor Balancing Act for UAVs

Spectral imaging sensors, lidar, and a thermal or RGB camera are often integrated into one airborne package to best acquire streams of information about the surrounding environment.

CHRIS VAN VEEN, HEADWALL PHOTONICS

Commercial applications such as crop disease detection, environmental monitoring, geological exploration and infrastructure inspection are prime examples of how the confluence of drones and sensing instruments is evolving globally. On-demand data collection, sometimes several times daily, is the key advantage of using drones as host platforms for sensing instruments. In practical terms, a consistent supply of actionable data is available much more affordably than from manned aircraft or satellites. Indeed, the entire data cycle from collection to analysis can be compressed from several days to a few hours using drones.

This application combines VNIR hyperspectral with a thermal camera and a highly accurate GPS/IMU. The Ethernet switch moves the data from the thermal camera and the GPS/IMU into the Nano-Hyperspec VNIR with its 480-GB solid-state storage. Courtesy of Headwall Photonics.

Spectral imaging sensors include multispectral and hyperspectral, with the former providing a small number of wide bands (sometimes with gaps in between) and the latter providing hundreds of contiguous narrow bands. There is a place for both, but the deciding factors are specificity and discrimination: the ability to classify accurately between certain minerals in the SWIR range from 1000 to 2500 nm, for example, or between healthy and diseased foliage in the visible near-IR (VNIR) range from 400 to 1000 nm.

The more bands of spectral data the sensor collects, the more accurate the decision-making becomes. Sometimes the spectral range of interest is well-defined beforehand; in such cases a multispectral sensor might well be sufficient. But when the natural phenomenon of interest is not clearly defined, or when there is a similarity between one environmental feature and the next, there is an advantage to having hundreds of narrow spectral bands. In short, hyperspectral sensors afford a higher level of classification than multispectral ones.
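To make that classification advantage concrete, the sketch below (Python/NumPy; not from the article) implements the spectral angle mapper, a common way to match each pixel's measured spectrum against a library of reference spectra. The function name and data layout are illustrative assumptions.

```python
import numpy as np

def classify_sam(cube, library):
    """Spectral angle mapper: assign each pixel the reference class
    whose spectrum makes the smallest angle with the pixel's spectrum.

    cube:    (rows, cols, bands) reflectance data.
    library: dict mapping class name -> (bands,) reference spectrum.
    Returns a (rows, cols) array of class-name labels.
    """
    names = list(library)
    refs = np.stack([library[n] for n in names])        # (classes, bands)
    flat = cube.reshape(-1, cube.shape[-1])             # (pixels, bands)
    # Normalize so the dot product equals the cosine of the spectral angle.
    flat = flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12)
    refs = refs / (np.linalg.norm(refs, axis=1, keepdims=True) + 1e-12)
    angles = np.arccos(np.clip(flat @ refs.T, -1.0, 1.0))  # (pixels, classes)
    labels = np.array(names)[angles.argmin(axis=1)]
    return labels.reshape(cube.shape[:2])
```

With a 270-band cube, the angle is computed over the full spectral signature; a multispectral sensor computes the same quantity over a handful of wide bands, with correspondingly less power to separate similar materials.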

This UAV features a fully integrated multirotor remote-sensing platform with a micro-hyperspectral (SWIR 900- to 2500-nm) camera, HyperCore data hub and a full GPS/IMU (inertial measurement unit). Courtesy of Headwall Photonics.

Very often, users focus an inordinate amount of attention on choosing their unmanned aerial vehicle (UAV). But in relative terms, it is simply a platform on which spectral imaging sensors and other instruments are mounted. The cost of a UAV and associated parts (such as a gimbal) represents only 10 to 20 percent of the entire solution value. Choosing between a fixed-wing and a multirotor UAV is certainly an important exercise, but choosing the instruments and integrating them sensibly is even more important. Once the mission is defined and the payload set, users can go about selecting the most sensible UAV. Altitude, acreage, field of view, climate, flight duration and cost are all factors worth understanding beforehand.

Common instrument pairings

UAVs need to carry a complementary range of instruments to carry out most remote-sensing missions. The imaging sensor can be multispectral or hyperspectral, depending on the number of bands required and the spectral and spatial resolution needed. These sensors collect spectral image data either across the VNIR range or across the SWIR range. In some cases, users want to see across both ranges, from 400 nm all the way to 2500 nm in a single instrument having coregistered pixels.

Once the spectral imaging sensor is selected, two instruments can play crucial roles in ensuring the quality of the collected data: The first is a global-positioning system combined with an inertial measurement unit (GPS/IMU). These instruments are small and light, allowing them to be positioned as needed on the UAV. The GPS/IMU keeps track of the geographical coordinates during flight and is a key source of data during post-processing of the images. Detecting crop diseases on a tree-by-tree or leaf-by-leaf basis demands very precise positioning data, which the GPS/IMU provides with the help of strategically positioned antennas on the UAV.
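In practice, the navigation record and the imaging sensor log at different rates, so post-processing interpolates the GPS/IMU states onto each frame's timestamp. A minimal sketch follows; the column layout and function name are illustrative assumptions, not any vendor's actual format.

```python
import numpy as np

def geotag_frames(frame_times, nav_times, nav_states):
    """Interpolate GPS/IMU states onto hyperspectral frame timestamps.

    frame_times: (F,) frame timestamps in seconds.
    nav_times:   (N,) GPS/IMU record timestamps in seconds.
    nav_states:  (N, 6) columns: lat, lon, alt, roll, pitch, yaw.
    Returns an (F, 6) array with one interpolated state per frame.
    Note: a heading that wraps through 360 degrees should be unwrapped
    (e.g., with np.unwrap) before linear interpolation.
    """
    return np.column_stack([
        np.interp(frame_times, nav_times, nav_states[:, k])
        for k in range(nav_states.shape[1])
    ])
```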

The second instrument is lidar, which uses pulsed lasers to measure the distance from the UAV to ground features such as treetops, hills, bodies of water and so on. Lidar developers are adopting the “small-and-light” trend for payload efficiency with small units that can be positioned as needed on the UAV; the lidar unit should be integrated so that it “looks” in the same direction as the spectral imaging sensor. This ensures a more accurate synthesis between the respective data streams. The data product of a lidar system is a point cloud, a three-dimensional array of points in space that correspond to all of the features in the scene being scanned. This point cloud can be converted into a digital elevation map, which is used to generate georectified imagery from line-scan spectral imaging sensors. Particularly in locations where accurate digital elevation maps are not available, the data from lidar can significantly improve the quality of airborne spectral imagery.
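Converting a point cloud into an elevation grid can be as simple as binning points into cells and keeping the highest return per cell, which yields a surface model (a true ground model would first filter out vegetation returns). A minimal sketch, assuming an (N, 3) array of x, y, z points in meters:

```python
import numpy as np

def point_cloud_to_dem(points, cell_size):
    """Grid a lidar point cloud into a simple digital elevation model,
    keeping the highest return per cell."""
    x, y, z = points.T
    cols = ((x - x.min()) / cell_size).astype(int)
    rows = ((y - y.min()) / cell_size).astype(int)
    dem = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    # np.fmax treats NaN cells as empty, so the first return fills them
    # and later returns only replace it if they are higher.
    np.fmax.at(dem, (rows, cols), z)
    return dem
```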

Thermal and RGB cameras

Additional complementary instruments for remote-sensing missions can include thermal and RGB framing cameras and fiber optic downwelling irradiance sensors (FODIS). The latter uses an optical fiber to collect the spectrum of radiation from the sun, which varies with time of day, cloud cover, angle of the UAV and other factors. In addition, radiometric calibration of the spectral imaging sensor is a vital part of the integration process. Having accurate measurements of the upwelling radiance (the light collected by the imaging sensor) and downwelling irradiance (measured by the FODIS) allows the accurate calculation of the reflectance spectrum for each pixel in the scene being imaged. A FODIS can be attached to the top of any UAV and can be a single fiber or a whole array.
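The reflectance computation itself reduces to a per-pixel, per-band ratio. The sketch below assumes the imagery has already been radiometrically calibrated to radiance, the FODIS spectrum has been resampled onto the imager's band centers, and the surface is approximately Lambertian; all three are simplifications of a real calibration workflow.

```python
import numpy as np

def reflectance(radiance_cube, fodis_irradiance, eps=1e-12):
    """Apparent reflectance from upwelling radiance and downwelling irradiance.

    radiance_cube:    (rows, cols, bands) upwelling radiance from the imager.
    fodis_irradiance: (bands,) downwelling irradiance from the FODIS,
                      resampled to the imager's band centers.
    Returns unitless reflectance per pixel and band, assuming a
    Lambertian surface: R = pi * L / E.
    """
    return np.pi * radiance_cube / np.maximum(fodis_irradiance, eps)
```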

The Nano-Hyperspec sensor collects 270 spectral bands across the 400- to 1000-nm visible near-IR (VNIR) region and stores this data onboard its own 480-GB solid-state drive. Lidar collects terrain/elevation information, while additional data is collected by thermal and RGB sensors. Courtesy of Headwall Photonics.


In total, several complementary but different instruments can comprise a remote-sensing solution aboard a UAV: a spectral imaging sensor, an accurate GPS/IMU, lidar, a FODIS to correct for solar variability, and a thermal or RGB camera. Selecting a GPS/IMU must be done carefully because the accuracy from one to another can vary markedly. Lidar is used occasionally, as are FODIS, thermal and RGB, but when everything is paired, the data streams from all instruments need to be synthesized along with the image data. This represents the data storage/processing side of the payload. Fortunately, small and lightweight data-fusion hubs such as Headwall’s HyperCore tie everything together nicely with a range of input ports and ample storage for the data coming from each instrument.

The HyperCore data hub synthesizes and stores the data streams coming from the SWIR hyperspectral sensor, the GPS/IMU, and other connected instruments. Courtesy of Headwall Photonics.

Since the amount of data collected from these instruments is vast (easily more than 1 GB for a small mission), streaming data from the UAV to the ground is presently not feasible. A hyperspectral sensor providing coverage across more than 300 bands is collecting 100 times what a typical three-band RGB sensor does. A 2000-frame VNIR data cube comprises nearly 700 MB of data. Using a frame rate of 250 Hz, this leads to a data rate of nearly 84 MB/s. It would take a very powerful and expensive Wi-Fi network on both ends (airborne and ground station) to “stream” this amount of data, which doesn’t factor in contributions from lidar, GPS/IMU, thermal and so on. The other risk is lost data if the UAV passes beyond the range of the network. Consider, too, that the UAV would need to be equipped with its own data transmitter and antenna array, which would consume payload budget and power itself. However, while streaming remote-sensing data isn’t feasible now, work is ongoing to make it so.
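The arithmetic behind those figures is easy to verify. The band count and frame rate come from the article; the 640 across-track pixels and 2 bytes per sample are assumptions typical of this sensor class.

```python
spatial_pixels = 640      # across-track pixels (assumed)
bands = 270               # Nano-Hyperspec VNIR band count
bytes_per_sample = 2      # 12- to 16-bit samples stored in 2 bytes (assumed)
frame_rate = 250          # Hz

frame_bytes = spatial_pixels * bands * bytes_per_sample   # 345,600 bytes/frame
cube_mb = frame_bytes * 2000 / 1e6                        # ~691 MB per 2000 frames
rate_mb_s = frame_bytes * frame_rate / 1e6                # ~86 MB/s sustained

# Lands near the article's "nearly 700 MB" and "nearly 84 MB/s"
# (the small gap is rounding and decimal- vs. binary-megabyte convention).
print(f"2000-frame cube: {cube_mb:.0f} MB, sustained rate: {rate_mb_s:.1f} MB/s")
```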

In conjunction with a highly accurate GPS/IMU, post-processing tasks such as orthorectification help ensure straight lines and square pixels. The normal airborne variables such as roll, pitch and yaw affect a UAV just as they would a Boeing 777, and for precise image data these variables need to be accurately measured for use during post-processing. Powerful application software such as Headwall’s Spectral-View takes care of not only orthorectification but also stitching multiple data cubes together into a mosaic. The software must be both intuitive and powerful, since its function is to manage hyperspectral data cubes that are often several gigabytes in size. Because the software is Windows-compatible, the processing requirements are within reach of practically any of today’s desktop computers with sufficient speed, memory and storage.

The quality of collected spectral image data depends on factoring in variables such as roll, pitch and yaw, as well as latitude, longitude and altitude. For every remote-sensing mission — whether from a UAV or a manned aircraft — every variable and its likely effect on the data must be understood.

During integration, one consideration is whether to implement a stabilizing gimbal. For multirotor craft, a stabilizing gimbal reduces the effects of UAV roll, pitch and yaw on the imaging sensor attached to it, helping ensure “square” pixels once the data is downloaded and analyzed. A gimbal, however, adds to the payload budget and is another device that draws power. For some situations where multirotor UAVs are used, a custom stabilizing gimbal is beneficial; on the other hand, a highly precise GPS/IMU paired with orthorectification during post-processing can yield comparable if not better results. Minus the gimbal, payload budget is saved, and longer battery life and flight durations are possible.
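The core of orthorectification is rotating each pixel's line of sight from the sensor frame into a ground-referenced frame using the measured roll, pitch and yaw, then intersecting it with the terrain. The sketch below shows only that rotation step, with a flat-ground intersection and NED (north-east-down) conventions assumed; production software such as Spectral-View also handles lens models and the digital elevation map.

```python
import numpy as np

def ground_offset(roll, pitch, yaw, altitude, view_vector):
    """Project one pixel's line of sight onto flat ground.

    roll, pitch, yaw: attitude in radians from the GPS/IMU.
    altitude:         height above ground in meters.
    view_vector:      (3,) unit vector in the sensor frame; (0, 0, 1) is
                      nadir in the NED convention used here (z points down).
    Returns the (north, east) offset in meters from the point directly
    below the sensor to where this pixel's ray meets the ground plane.
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Standard aerospace ZYX composition: body frame -> local level frame.
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    d = Rz @ Ry @ Rx @ np.asarray(view_vector)  # ray in the ground frame
    t = altitude / d[2]                         # scale factor to the ground
    return d[0] * t, d[1] * t
```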

What the data reveals

The combination of full hyperspectral imagery along with a lidar point cloud is an extremely powerful one, yielding an extraordinary view of the scene. RGB and IR (thermal) sensors collect a panoramic view of a scene while hyperspectral sensors collect image data on a frame-by-frame basis. Together, the data streams are complementary. For example, a UAV can fly at night across a prescribed flight path to collect important thermal imagery. Later, during daylight conditions, the same flight profile collects hyperspectral data. The GPS/IMU allows the user to marry these data streams into an overall representation of the field of view. Understanding the synthesis of these data streams is a critical part of integration, and knowing which instruments are necessary and where to place them on the craft is also vital.
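Once both data sets are georectified, marrying them can reduce to a coordinate lookup. The sketch below, with illustrative argument names, samples a thermal raster at each hyperspectral pixel's ground coordinates, assuming both were produced by steps like those sketched above.

```python
import numpy as np

def overlay_thermal(hyper_coords, thermal, thermal_origin, cell_size):
    """Nearest-neighbor lookup of thermal values at hyperspectral pixel
    geolocations, so the two flights share one coordinate frame.

    hyper_coords:   (rows, cols, 2) east/north of each hyperspectral pixel.
    thermal:        (H, W) georectified thermal image, row 0 at the
                    northern edge (standard raster convention).
    thermal_origin: (east0, north0) of thermal pixel (0, 0).
    cell_size:      thermal ground sample distance in meters.
    Returns a (rows, cols) thermal layer aligned with the hyperspectral cube.
    """
    cols = np.round((hyper_coords[..., 0] - thermal_origin[0]) / cell_size).astype(int)
    rows = np.round((thermal_origin[1] - hyper_coords[..., 1]) / cell_size).astype(int)
    valid = (rows >= 0) & (rows < thermal.shape[0]) & \
            (cols >= 0) & (cols < thermal.shape[1])
    out = np.full(hyper_coords.shape[:2], np.nan)
    out[valid] = thermal[rows[valid], cols[valid]]
    return out
```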

The data collected from each instrument improves the body of knowledge for crop science, environmental monitoring, geological research and other applications. Foods can become more plentiful and wholesome thanks to the ability to pinpoint potential disease conditions on a per-plant basis. The ability to target mineral deposits nondestructively through UAV-based SWIR can make the mining industry more efficient using small and nimble UAVs.

When using more powerful UAVs with higher payload capacities, two other imaging sensor types can be deployed. The first is a combined VNIR-SWIR hyperspectral instrument that covers the 400- to 2500-nm spectral range with coregistered or coaligned pixels within a single enclosure. The second is a mission-specific chlorophyll fluorescence sensor that collects important data within the narrow 670- to 780-nm range for evaluating environmental conditions (important indices within the Oxygen-A and Oxygen-B bands). Sensors of this type require higher-powered UAVs with large payload capacities, which are still more cost-effective to deploy and tactically more maneuverable when compared with manned aircraft or satellites.

Meet the author

Christopher Van Veen is marketing communications manager at Headwall Photonics in Bolton, Mass. He holds a bachelor’s degree from Bentley University and an MBA from Southern New Hampshire University.


Published: January 2018
