A lidar-based AR head-up display allows drivers to “see through” objects, alerting them to potential hazards without distraction. Researchers from the University of Cambridge, the University of Oxford, and University College London (UCL) developed the technology, which uses lidar data to create ultrahigh-definition holographic representations of road objects.
Those objects are then projected directly into the driver’s eyes, in contrast to the 2D windshield projections used in most head-up displays.
“Our results show that the 2D windscreen projections could distract the driver as they appear in a small area of the windscreen and the driver still must shift the gaze from the road toward the windscreen,” Jana Skirnewskaja, lead author of a study describing the technology and a Ph.D. candidate at the University of Cambridge, told Photonics Media. “In the case of the 3D augmented reality optical setup, the holograms are directly projected into the driver’s eyes so that the pupil acts as a lens to focus the projected holographic objects on the road, matching the distance and the size of the real-life objects.”
An image of a tree based on lidar data (left). The same image converted to a hologram (right). Courtesy of Jana Skirnewskaja.
The setup, she said, consists of a helium-neon (HeNe) laser, linear polarizers, a half-wave plate, an ultrahigh-definition spatial light modulator, and convex and concave lenses. It takes input from a lidar sensor, whose data is processed by algorithms that then pass the relevant information to the optical system.
Using lidar, the researchers scanned Malet Street, a busy area on the UCL campus in central London. Co-author Phil Wilkes, a geographer who usually uses lidar to scan tropical forests, scanned the entire street with a technique called terrestrial laser scanning. Millions of pulses were sent out from multiple positions along Malet Street to create a 3D model.
“This way, we can stitch the scans together, building a whole scene, which doesn’t only capture trees, but cars, trucks, people, signs, and everything else you would see on a typical city street,” Wilkes said. “Although the data we captured was from a stationary platform, it’s similar to the sensors that will be in the next generation of autonomous or semi-autonomous vehicles.”
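To give a concrete sense of how scans captured from multiple positions can be stitched into one scene, the sketch below registers two scans with the Open3D library’s ICP alignment. The filenames, parameter values, and the choice of ICP are illustrative assumptions; the study’s own stitching pipeline is not described at this level of detail.

```python
# Minimal sketch: merging two terrestrial laser scans into one scene.
# Assumes each scan position was exported as a PLY file (hypothetical names).
import open3d as o3d
import numpy as np

scan_a = o3d.io.read_point_cloud("scan_position_a.ply")
scan_b = o3d.io.read_point_cloud("scan_position_b.ply")

# Downsample so the alignment runs on a manageable number of points.
source = scan_b.voxel_down_sample(voxel_size=0.05)
target = scan_a.voxel_down_sample(voxel_size=0.05)

# Align scan B onto scan A with point-to-point ICP, starting from an
# identity initial guess (assumes the scans roughly overlap already).
result = o3d.pipelines.registration.registration_icp(
    source, target,
    max_correspondence_distance=0.5,
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)

# Apply the recovered transform and merge the scans into a single cloud.
scan_b.transform(result.transformation)
merged_scene = scan_a + scan_b
```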
When the 3D model of Malet Street was complete, the researchers transformed various objects on the street into holographic projections. The lidar data, in the form of point clouds, was processed by separation algorithms to identify and extract the target objects. Another algorithm then converted the extracted objects into computer-generated diffraction patterns, which were sent to the optical setup.
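The conversion of a target object into a diffraction pattern can be illustrated with the classic Gerchberg-Saxton algorithm, a standard way of computing a phase-only hologram for a spatial light modulator. The following is a minimal sketch of that general technique, not the authors’ specific algorithm:

```python
# Illustrative sketch: compute a phase-only hologram whose far-field
# (Fourier) image approximates a target amplitude image, e.g. a tree
# silhouette rendered from the extracted lidar object.
import numpy as np

def gerchberg_saxton(target_amplitude, iterations=50):
    """Return an SLM phase pattern approximating target_amplitude."""
    phase = np.random.uniform(0.0, 2 * np.pi, target_amplitude.shape)
    for _ in range(iterations):
        # Propagate the target (with the current phase) back to the SLM plane.
        field = np.fft.ifft2(target_amplitude * np.exp(1j * phase))
        # Keep only the phase at the SLM plane (phase-only modulator).
        slm_phase = np.angle(field)
        # Propagate forward again and update the image-plane phase estimate.
        image = np.fft.fft2(np.exp(1j * slm_phase))
        phase = np.angle(image)
    return slm_phase
```

The returned phase pattern would be what gets displayed on the spatial light modulator; the laser illuminating it reconstructs the object in the driver’s view.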
“With the help of an algorithm, we are able to project several layers, hence several holographic objects into the driver’s eyes, creating augmented reality in the driver’s field of view on the road,” Skirnewskaja told Photonics Media.
The holographic projection the driver sees is true to the scale and position of the real object it represents on the street. For example, a hidden street sign would appear as a holographic projection at its actual position behind the obstruction, acting as an alert mechanism.
The researchers plan to refine the system by personalizing the layout of the head-up display. They have created an algorithm capable of projecting several layers of different objects that can be freely arranged in the driver’s field of view. For example, a distant traffic sign in the first layer can be projected at a smaller size, while a closer warning in the second layer displays the same sign at a larger size.
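The scaling rule is simple geometry: to look true to life, each layer is rendered so that its angular size matches that of the real object at its distance. A back-of-the-envelope sketch, where the helper function and example numbers are hypothetical:

```python
# How large should a holographic object appear at a given distance?
# The angle it subtends at the eye is 2 * atan(size / (2 * distance)).
import math

def angular_size_deg(real_size_m, distance_m):
    """Angle subtended at the eye by an object of given size and distance."""
    return math.degrees(2 * math.atan(real_size_m / (2 * distance_m)))

# A 0.75 m traffic sign 40 m away subtends about 1.1 degrees; the same
# sign shown as a close-range warning at 10 m subtends about 4.3 degrees,
# so the nearer layer is rendered roughly four times larger.
far_layer = angular_size_deg(0.75, 40.0)   # ~1.07 degrees
near_layer = angular_size_deg(0.75, 10.0)  # ~4.29 degrees
```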
“Currently we are testing the technology within a car setting. We intend to experiment with different light sources to decrease the size of the optical setup and reduce the number of lenses by implementing an advanced algorithm that creates virtual lenses,” Skirnewskaja said. “This will allow us to practically fit the optical setup into the car environment.”
The research was published in Optics Express (www.doi.org/10.1364/oe.420740).