
Army Robot Detects, Shares Environmental Changes with Human Teammate in Real Time

The robotic component of a human-robot team designed by the U.S. Army can detect physical changes in 3D and share what it finds with a human teammate in real time. Augmented reality delivers the information to the human, who can assess it and promptly decide how to act.

A team of U.S. Army scientists demonstrated the approach in a structured real-world environment by pairing a small, autonomous, lidar-equipped mobile ground robot with a human teammate wearing AR glasses. As the robot patrolled its surroundings, it compared its current lidar readings against previous ones in real time to detect changes. Any change or perceived abnormality was instantly displayed in the eyewear for human interpretation.
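The researchers' own change-detection pipeline is not reproduced here, but the core idea, comparing a current 3D scan against a previous one and flagging occupancy differences, can be illustrated with a minimal sketch. The voxel size, function names, and numpy-only differencing below are illustrative assumptions, not the Army's implementation:

```python
import numpy as np

VOXEL_SIZE = 0.25  # meters; assumed grid size, coarse enough to keep comparison fast

def voxel_keys(points: np.ndarray) -> set:
    """Quantize an N x 3 point cloud into a set of occupied voxel indices."""
    return set(map(tuple, np.floor(points / VOXEL_SIZE).astype(int)))

def detect_changes(previous_scan: np.ndarray, current_scan: np.ndarray):
    """Return voxels that appeared or disappeared between two patrols."""
    before, after = voxel_keys(previous_scan), voxel_keys(current_scan)
    appeared = after - before      # new occupancy, e.g., an object placed in the scene
    disappeared = before - after   # lost occupancy, e.g., an object removed
    return appeared, disappeared

# Example: a point added at (5, 2, 0) shows up as an "appeared" voxel.
prev = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 0.0]])
curr = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 0.0], [5.0, 2.0, 0.0]])
appeared, disappeared = detect_changes(prev, curr)
print(appeared, disappeared)
```

In a real system the two scans would first be registered to a common map frame (via the robot's localization) before differencing; this sketch assumes already-aligned point clouds.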

Detected changes could include anything from camouflaged enemy soldiers to improvised explosive devices, said Christopher Reardon, a researcher at the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory.

The researchers tested lidar sensors of varying resolutions on the robot to determine the best fit for their application. When the robot shared the measurements and changes detected by each sensor, the human teammate was able to interpret information from both the lower- and higher-resolution lidars. The finding suggests that, depending on the size and magnitude of the changes to be detected, the system could use lighter, smaller, faster, and less-expensive sensors.
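To make the resolution trade-off concrete, one can emulate a lower-resolution sensor by subsampling a dense scan and rerunning the differencing sketch above. The keep_fraction parameter and random subsampling are assumptions for illustration only; the researchers varied resolution by using physically different sensors (a Velodyne VLP-16 and an Ouster OS1), not by subsampling:

```python
import numpy as np

def emulate_lower_resolution(points: np.ndarray, keep_fraction: float,
                             seed: int = 0) -> np.ndarray:
    """Randomly subsample a point cloud to approximate a lower-resolution lidar.

    keep_fraction is an assumed stand-in for the channel-count ratio between
    two sensors (e.g., roughly 0.5 for a 16-beam versus a 32-beam unit); real
    beam geometry differs, so this is only a rough proxy.
    """
    rng = np.random.default_rng(seed)
    n_keep = max(1, int(len(points) * keep_fraction))
    keep = rng.choice(len(points), size=n_keep, replace=False)
    return points[keep]
```

Under the voxel differencing above, a large change still occupies many voxels after subsampling, while a small one may fall below the grid's sensitivity, which mirrors the observation that lighter, cheaper sensors can suffice when the changes of interest are large.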

The capability could be incorporated into future mixed-reality interfaces, such as the Army’s Integrated Visual Augmentation System (IVAS) goggles. “Incorporating mixed reality into soldiers’ eye protection is inevitable,” Reardon said. “This research aims to fill gaps by incorporating useful information from robot teammates into the soldier-worn visual augmentation ecosystem while simultaneously making the robots better teammates to the soldier.”

The two robots used in the experiments are identically equipped, with the exception of Velodyne VLP-16 lidar (left) and Ouster OS1 lidar (right). Courtesy of the U.S. Army.
This real-world testing of the human-robot team contrasts with much of the existing academic research on mixed-reality interfaces for human-robot teaming, which relies on external instrumentation, typically in a lab, to perform the calculations needed to share information between teammates. Many engineering efforts to provide humans with mixed-reality interfaces do not examine teaming with autonomous mobile robots, Reardon said.


The research is part of the lab’s ongoing effort to provide contextual awareness to autonomous robotic ground platforms in maneuver and mobility scenarios, under the Artificial Intelligence for Mobility and Maneuver Essential Research Program.

Participating researchers also join international coalition partners in the Technical Cooperation Program’s Contested Urban Environment Strategic Challenge (TTCP CUESC) events to test and evaluate human-robot teaming technologies.

Future studies will explore how to strengthen teaming, with a specific focus on increasing human interactivity with the robot’s detected changes. This will give the human added real-time context about the situation in which the robot is reporting information, and help distinguish human-caused changes from natural environmental changes, or false positives, Reardon said.

Planned enhancements would also improve the autonomous context understanding and reasoning capabilities of the robotic platform, for example by enabling the robot to learn and predict which types of changes constitute a threat.
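Purely as a sketch of what such learning might look like, a detected change could be reduced to a few hand-picked features and scored by a classifier before being pushed to the AR display. The features, synthetic training data, and choice of logistic regression below are all assumptions; the published work does not specify a model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per detected change: [voxel count, centroid height (m),
# distance from patrol route (m)]. Labels: 1 = operator marked it a threat.
X_train = np.array([
    [120, 0.5, 2.0],   # large, low, near the route -> threat (synthetic example)
    [4,   3.0, 15.0],  # tiny, high, far away -> benign (synthetic example)
    [80,  0.3, 1.0],
    [6,   2.5, 20.0],
])
y_train = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(X_train, y_train)

# Score a newly detected change before displaying it to the soldier.
new_change = np.array([[95, 0.4, 3.0]])
print(model.predict_proba(new_change)[0, 1])  # estimated threat probability
```

Operator feedback from the AR interface could, in principle, supply the labels over time, though how such learning would actually be trained is beyond what the article describes.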

The research, titled “Enabling Situational Awareness via Augmented Reality of Autonomous Robot-Based Environmental Change Detection,” was a collaboration between the U.S. Army and the University of California, San Diego. The collaborators presented the research at the 12th International Conference on Virtual, Augmented, and Mixed Reality, part of the International Conference on Human-Computer Interaction.

Published: August 2020
Glossary
lidar
Lidar, short for light detection and ranging, is a remote sensing technology that uses laser light to measure distances and generate precise, three-dimensional information about the shape and characteristics of objects and surfaces. Lidar systems typically consist of a laser scanner, a GPS receiver, and an inertial measurement unit (IMU), all integrated into a single system.
Tags: US Army, Americas, University of California - San Diego, lidar, defense, AR, vision, Vision & Control, Sensors & Detectors, sensors, robot, human-robot collaboration, mixed reality, 3D, 3D AR, 3D vision, The News Wire
