
Metasurface Waveguide Could Lower AR Losses and Improve Image Quality

ROCHESTER, N.Y., Nov. 25, 2025 — Augmented reality (AR) waveguide displays often exhibit low efficiency because the incoming light interacts multiple times with the input port, also known as the in-coupler, where the image enters the glass. These losses limit system brightness and clarity.

To ensure bright, uniform visual output from AR devices, a team at the University of Rochester developed an in-coupler featuring three specialized zones made with metasurface materials. Each zone is optimized for high efficiency.

The multizone metasurface in-coupler could help bring AR devices a step closer to practical, everyday use in education, entertainment, engineering, medicine, and other fields.

“Many of today’s AR headsets are bulky and have a short battery life with displays that are dim and hard to see, especially outdoors,” said professor Nickolas Vamivakas, who led the research. “By creating a much more efficient input port for the display, our work could help make AR glasses much brighter and more power-efficient, moving them from being a niche gadget to something as light and comfortable as a regular pair of eyeglasses.”

Researchers designed a high-efficiency, multizone metasurface waveguide in-coupler that could improve the brightness and clarity of AR waveguide displays, making them more practical for everyday use. Courtesy of the University of Rochester/J. Adam Fenster.

The researchers fabricated the metasurfaces using electron beam lithography and atomic layer deposition. They designed the metasurface patterns to capture incoming light efficiently and to reduce light leakage. They also ensured that the metasurfaces preserve the shape of the incoming light, which is essential for high-quality images.

The design for the in-coupler is based on previous research in which the team theoretically showed that a multizone in-coupler could improve efficiency and image quality in AR devices. The current work translates the team’s idealized, multizone theory into an actual AR component. Throughout the design process, the team was guided by a custom optimization framework that incorporated realistic efficiency values and accounted for suboptimal efficiency sums and material losses.
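The article does not describe the internals of that optimization framework, but the general trade-off it manages, balancing per-zone efficiency against uniformity under a realistic loss budget, can be illustrated with a minimal sketch. Everything below is hypothetical: the Gaussian efficiency curves, zone parameters, loss value, and grid-search strategy are stand-ins for illustration, not the team's actual method.

import numpy as np

# Hypothetical per-zone coupling-efficiency model vs. field angle (degrees).
# The real design relies on rigorous simulations of metasurface gratings;
# these Gaussian curves are illustrative stand-ins only.
def zone_efficiency(theta_deg, center_deg, width_deg=6.0, peak=0.4):
    return peak * np.exp(-((theta_deg - center_deg) / width_deg) ** 2)

angles = np.linspace(-10, 10, 41)  # horizontal FOV, 0.5-degree steps

def system_efficiency(zone_centers, material_loss=0.05):
    # At each field angle, assume the best-matched zone dominates coupling,
    # then apply an assumed material-loss budget.
    zones = np.array([zone_efficiency(angles, c) for c in zone_centers])
    return zones.max(axis=0) * (1.0 - material_loss)

def score(zone_centers):
    eta = system_efficiency(zone_centers)
    # Reward high field-averaged efficiency, penalize non-uniformity.
    return eta.mean() - 0.5 * eta.std()

# Coarse grid search over the outer zone centers (a stand-in for the team's
# custom optimization framework, whose internals the article does not give).
best = max(
    ((score((c1, 0.0, c3)), (c1, 0.0, c3))
     for c1 in np.linspace(-10, 0, 11)
     for c3 in np.linspace(0, 10, 11)),
    key=lambda t: t[0],
)
print("best zone centers (deg):", best[1])
print("field-averaged efficiency: %.1f%%" % (100 * system_efficiency(best[1]).mean()))

In the published work, the per-zone efficiencies would come from electromagnetic simulations of the actual gratings rather than analytic curves, but the same kind of field-averaged figure of merit applies.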

“This paper is the first to bridge the gap from that idealized theory to a practical, real-world component,” Vamivakas said.

Ultrathin metasurface materials can bend, focus, and filter light in ways that lenses made with conventional materials cannot, and offer greater design and manufacturing flexibility than traditional optics. The use of sophisticated metasurface gratings gave the researchers the design flexibility needed to build three different in-coupler zones. Advanced fabrication methods provided the precision required to create complex, high-aspect-ratio nanostructures.

The researchers tested each metasurface zone individually using a custom-built optical setup. They then tested the fully assembled, three-zone AR device as a complete system, using a similar optical setup to measure the total coupling efficiency across the entire horizontal FOV from -10 degrees to 10 degrees.


The experimental results validated the metasurface designs and confirmed the viability of the multizone approach under realistic operating conditions. The measurements showed strong agreement with simulations across most of the FOV. The average measured efficiency across the field was 30%, which closely matched the simulated average of 31%.

The one exception was at the edge of the FOV, at -10 degrees, where the measured efficiency was 17%, compared to the simulated 25.3%. The researchers attribute this to the design’s high angular sensitivity at that angle and to potential minor fabrication imperfections.
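To make the field-averaged figures concrete, the short sketch below averages coupling efficiency over sampled field angles. Only the endpoints come from the article (30% measured vs. 31% simulated on average, and 17% vs. 25.3% at -10 degrees); the intermediate samples are invented for illustration.

import numpy as np

# Field-angle samples across the horizontal FOV (degrees).
angles_deg = np.array([-10, -5, 0, 5, 10])

# Measured and simulated coupling efficiencies. The -10 degree values and the
# resulting averages match the reported numbers; the rest are hypothetical.
measured  = np.array([0.170, 0.300, 0.350, 0.330, 0.350])
simulated = np.array([0.253, 0.310, 0.340, 0.320, 0.330])

print("field-averaged measured:  %.0f%%" % (100 * measured.mean()))   # ~30%
print("field-averaged simulated: %.0f%%" % (100 * simulated.mean()))  # ~31%
worst = np.argmax(np.abs(measured - simulated))
print("largest deviation at %d degrees" % angles_deg[worst])          # -10 degrees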

The team is now working to apply the new metasurface design and optimization framework to other components of the waveguide to demonstrate a complete, high-efficiency, metasurface-based system. Once they achieve this goal, the researchers plan to expand the design from a single color (green) to full-color (RGB) operation. They will also refine the design to improve fabrication tolerance and minimize the efficiency drop at the edge of the FOV.

“This work to improve the in-coupler, a primary source of light loss, is part of a larger project aimed at using metasurfaces to design the entire waveguide system, including the input port, output port, and all the optics that guide the light in between,” Vamivakas said.

Before the technology can be commercialized, the researchers will need to demonstrate a fully integrated prototype that pairs the in-coupler with a real micro-display engine and an out-coupler. They must also develop a robust, high-throughput manufacturing process to replicate the complex nanostructures at a low cost.

However, the current study demonstrates the feasibility of metasurface-based in-couplers for waveguide displays and provides an experimentally proven path toward developing in-couplers for high-efficiency AR displays.

Vamivakas believes that the team’s design approach could be extended to displays for applications beyond AR.

“While our focus is on AR, this high-efficiency, angle-selective light coupling technology could also be used in other compact optical systems, such as head-up displays for automotive or aerospace applications or in advanced optical sensors,” he said.

The research was published in Optical Materials Express (www.doi.org/10.1364/OME.576634).

Published: November 2025