
Complementary combinations

JAKE SALTZMAN, MANAGING EDITOR [email protected]

By definition, the notion of “sensor fusion” allows for the possibility of combining data from one sensing modality with virtually any other. Motion tracking provides a classic example of trimodal sensor fusion: The fusion of accelerometric, magnetometric, and gyroscopic data delivers precise information about orientation. In this case, the differences in the measurables that each individual sensing modality tracks are quite subtle.

At the same time, the modalities themselves are highly complementary. In combination, data obtained via each type of sensor delivers the utmost precision in determining orientation.
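A common way to realize this kind of trimodal fusion is a complementary filter, which blends the gyroscope's fast but drift-prone rate measurements with the accelerometer's slow but drift-free gravity reference. The sketch below is illustrative only (it is not drawn from the article): the single-axis `fuse_pitch` function and its readings are hypothetical, and a real implementation would also fold in magnetometer heading data.

```python
# Minimal single-axis complementary-filter sketch (hypothetical values).
# The gyro term tracks fast motion; the accelerometer term corrects
# long-term drift. alpha sets the balance between the two.
def fuse_pitch(prev_pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Blend integrated gyro rate (short-term) with the
    accelerometer-derived angle (long-term reference)."""
    gyro_pitch = prev_pitch + gyro_rate * dt  # integrate angular rate
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Example: a stationary sensor with a small gyro bias. Left alone, pure
# integration would drift; the accelerometer reference (0.0 rad) pulls
# the fused estimate back toward the true angle.
pitch = 0.1  # start with an erroneous estimate (rad)
for _ in range(200):
    pitch = fuse_pitch(pitch, gyro_rate=0.001, accel_pitch=0.0, dt=0.01)
```

Each modality covers the other's weakness: the gyroscope is immune to vibration that corrupts the accelerometer, while the accelerometer (and, in full implementations, the magnetometer) anchors the estimate against gyro drift.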

Sensor fusion can be far less nuanced, at least in terms of the sensors used. In 1997, for example, a paper published by a group from the National Institute of Standards and Technology (NIST) “describes a real-time hierarchical system that combines (fuses) data from vision and touch sensors to simplify and improve the operation of a coordinate measuring machine (CMM) used for dimensional inspection tasks”1. Even the paper’s title characterizes this as a unique combination, owing to its reliance on sensory processing techniques rather than traditional performance measurements to optimize CMM accuracy.

On a scale with “ubiquitous” at one end and “totally unusual” at the other, the fusion of lidar and radar falls somewhere in the middle. In automotive applications, particularly as autonomous mobility pushes toward the mainstream, the use of the two ranging techniques in tandem is already commonplace. Both lidar and radar are commercially mature systems, and, given their capabilities, both support similar applications.


On the other hand, it is easy to articulate lidar’s fundamental advantages by making direct, one-to-one comparisons to radar’s shortcomings, and vice versa; look no further than past editions of this magazine for proof.

As the authors from Silicon Austria Labs write in this edition’s cover story, the differences between lidar and radar open opportunities for their fusion, in both logical settings and more novel use cases. In the authors’ work, high-resolution lidar systems and application-durable radar systems operate simultaneously to benefit safety and performance on the roadways as well as in emergency search and rescue.
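One generic way to see why the two modalities complement each other numerically is inverse-variance weighting, a textbook fusion rule (not the Silicon Austria Labs method) in which the more precise sensor dominates and the fused estimate is more certain than either input alone. The sensor names and variance figures below are illustrative assumptions.

```python
# Textbook inverse-variance fusion of two range measurements.
# Illustrative only: the variances are assumed, not measured values.
def fuse_ranges(r_lidar, var_lidar, r_radar, var_radar):
    """Weight each range estimate by the inverse of its variance;
    the fused variance is lower than either input's."""
    w_lidar = 1.0 / var_lidar
    w_radar = 1.0 / var_radar
    fused = (w_lidar * r_lidar + w_radar * r_radar) / (w_lidar + w_radar)
    fused_var = 1.0 / (w_lidar + w_radar)
    return fused, fused_var

# Example: in clear conditions lidar is the more precise sensor, so the
# fused range sits close to the lidar reading.
r, v = fuse_ranges(r_lidar=50.0, var_lidar=0.01, r_radar=50.4, var_radar=0.25)
```

In fog or rain, the roles reverse: lidar's variance grows while radar's stays small, and the same rule automatically shifts trust to the radar, which is precisely the complementarity the cover story exploits.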

The two technologies that contrast so readily are, in fact, complementary. It is something many end users likely already know.

One additional quality puts the concept of sensor fusion in a class with many other dynamic processes: its potential to benefit from photonics technology. To this end, as part of the Horizon project CoRaLi-DAR, the Silicon Austria Labs team is developing a lidar-radar sensor fusion system built on an innovative platform of electro-photonic integrated circuits.



1. M. Nashman et al. (1997). A unique sensor fusion system for coordinate measuring machine tasks. Proc. SPIE International Symposium on Intelligent Systems and Advanced Manufacturing, Session: Sensor Fusion and Decentralized Control in Autonomous Robotic Systems.

Published: February 2025
Editorial
