A team at Stanford University has developed a device that enables lidar functionality with ordinary CMOS image sensors. The device makes it possible to obtain 3D data from technology that, on its own, is capable of seeing only in two dimensions. The team envisions the technology finding use in applications such as self-driving cars, drones, and extraterrestrial rovers.
Measuring the distance between objects with light is currently possible with lidar systems, which send out laser beams and measure returning light to calculate distance, speed, and trajectory. “Existing lidar systems are big and bulky, but someday, if you want lidar capabilities in millions of autonomous drones or in lightweight robotic vehicles, you’re going to want them to be very small, very energy efficient, and offer high performance,” said Okan Atalar, a doctoral candidate in electrical engineering at Stanford and first author of the corresponding paper.
The lab-based prototype lidar system introduced by a Stanford University research team was used to capture megapixel-resolution depth maps using a commercially available digital camera. Courtesy of Andrew Brodhead.
One approach to adding 3D imaging to standard sensors is to pair them with a light source and a modulator that turns the light on and off millions of times every second. By measuring the variations in the returning light, engineers can calculate distance. However, existing modulators require such large amounts of power that they become impractical for everyday use.
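The article does not give the distance formula, but in this kind of amplitude-modulated (indirect time-of-flight) scheme, distance follows from the phase lag between the outgoing and returning modulation envelopes. A minimal sketch of that standard relation, with a hypothetical few-MHz modulation frequency:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_shift_rad, mod_freq_hz):
    """Indirect time-of-flight: the returning light's modulation envelope
    lags the outgoing one by a phase proportional to round-trip time."""
    round_trip_time = phase_shift_rad / (2 * math.pi * mod_freq_hz)
    return C * round_trip_time / 2  # halve for one-way distance

def unambiguous_range(mod_freq_hz):
    """Maximum distance before the measured phase wraps past 2*pi."""
    return C / (2 * mod_freq_hz)

# e.g. modulating "millions of times per second" at a hypothetical 4 MHz:
f_mod = 4e6
print(unambiguous_range(f_mod))               # ~37.5 m
print(distance_from_phase(math.pi / 2, f_mod))  # quarter-cycle lag -> ~9.4 m
```

Note the trade-off this exposes: faster modulation gives finer depth resolution but a shorter unambiguous range.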
The solution proposed by the Stanford team is a simple acoustic modulator composed of a thin wafer of lithium niobate coated with transparent electrodes. Lithium niobate is piezoelectric, meaning that when electricity is introduced through the electrodes, the crystal lattice at the heart of its atomic structure changes shape. It vibrates at high, predictable, and controllable frequencies, and, as it vibrates, it strongly modulates light. With the addition of polarizers, the new modulator effectively turns light on and off several million times a second.
“What’s more, the geometry of the wafers and the electrodes defines the frequency of light modulation, so we can fine-tune the frequency,” Atalar said. “Change the geometry and you change the frequency of modulation.”
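The paper's exact geometry is not given here, but the geometry-sets-frequency idea Atalar describes can be illustrated with the textbook thickness-mode resonance of an acoustic plate; the velocity value below is a ballpark figure for longitudinal waves in lithium niobate, used only for illustration:

```python
def thickness_mode_resonance(thickness_m, acoustic_velocity_m_s=7330.0):
    """Rough resonance estimate for a thickness-mode acoustic resonator:
    the fundamental occurs when the plate is half an acoustic wavelength
    thick, so f = v / (2 * d). Thinner wafer -> higher frequency."""
    return acoustic_velocity_m_s / (2 * thickness_m)

# A hypothetical 0.5 mm wafer would resonate in the few-MHz range:
print(thickness_mode_resonance(0.5e-3))  # ~7.3 MHz
```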
The piezoelectric effect creates an acoustic wave through the crystal that rotates the polarization of light in desirable, tunable ways. A polarizing filter carefully placed after the modulator then converts this rotation into intensity modulation — making the light brighter and darker — effectively turning the light on and off millions of times per second.
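The rotation-to-intensity conversion described above follows Malus's law: a linear polarizer passes a fraction cos²θ of light whose polarization is rotated by θ from its pass axis. A sketch under illustrative assumptions (sinusoidal rotation, π/2 swing, hypothetical 4 MHz drive — none of these numbers are from the paper):

```python
import math

def transmitted_intensity(theta_rad, i0=1.0):
    """Malus's law: a linear polarizer transmits I0 * cos^2(theta)."""
    return i0 * math.cos(theta_rad) ** 2

def modulated_intensity(t, f_mod, swing_rad=math.pi / 2, i0=1.0):
    """Illustrative model of the scheme described above: the acoustic
    wave rotates the polarization sinusoidally at f_mod, and the output
    polarizer turns that rotation into a brightness modulation."""
    theta = swing_rad * math.sin(2 * math.pi * f_mod * t)
    return transmitted_intensity(theta, i0)

f = 4e6  # hypothetical acoustic drive frequency, Hz
print(modulated_intensity(0.0, f))        # rotation is zero: light "on"
print(modulated_intensity(1 / (4 * f), f))  # rotation hits 90 deg: light "off"
```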
“While there are other ways to turn the light on and off,” Atalar said, “this acoustic approach is preferable because it is extremely energy efficient.”
The technology can be integrated into a proposed system that uses off-the-shelf cameras, such as those used in cellphones and DSLRs.
Atalar and adviser Amin Arbabian, associate professor of electrical engineering and the project’s senior author, believe that it could become the basis for a new type of compact, low-cost, energy-efficient lidar — “standard CMOS lidar,” as they call it — that could find its way into myriad applications. To demonstrate its broad compatibility, the team built a prototype lidar system on a lab bench that used a commercially available digital camera as a receptor.
The team reported that its prototype captured megapixel-resolution depth maps while requiring only small amounts of power to operate the optical modulator. With additional refinements, Atalar said, the team has reduced the energy consumption at least tenfold below the already low threshold reported in the paper. The researchers believe that an energy reduction of several hundred times is within reach.
The research was published in Nature Communications (https://doi.org/10.1038/s41467-022-29204-9).