Photonics Shapes the Worlds of Augmented and Virtual Reality
HANK HOGAN, CONTRIBUTING EDITOR
Replacing smartphones with smart glasses isn’t practical … yet. In the meantime, advancements in photonics are helping to expand the use of smart glasses in educational, medical, military, and industrial settings.
According to technologists and end users, what’s needed to sustain and accelerate this growth are brighter light sources, more efficient delivery optics, and better eye-tracking sensors. Also on the wish list are systems with lower weight, more compact form factors, and reduced cost. These requirements apply both to virtual reality (VR) headsets that immerse users in a digital world and to augmented reality (AR) glasses or displays that superimpose digital data over real-world scenes.
An interactive game board as seen through AR glasses. Patterned light projected from the glasses bounces off the board, which acts as a retroreflector, to enable users to perceive a virtual image on the physical board. Courtesy of Tilt Five.
Research by Ningcheng “Peter” Li and Brian Park, two interventional radiologists at Oregon Health & Science University, offers a sneak peek into the future benefits of AR. At the April meeting of the American Roentgen Ray Society, Li presented results from a multisite study into the use of AR to guide radiologists and surgeons during surgical operations.
Practitioners conventionally rely on two-dimensional displays of three-dimensional CT and MRI scans of their patients. But AR could deliver the information in a much more intuitive way.
“We want to see that 3D model directly on top of the patient, completely aligned with the patient. So, every time we cut, every time we put a needle through a structure, we know what’s behind the structure. We know what’s around the structure,” Li said.
Earlier research by the group revealed that AR technology could help to improve the outcomes of particular procedures by cutting the number of steps or decreasing exposure to X-ray radiation during an operation. The group’s more recent studies have indicated that AR technology could help to speed up various patient registration techniques, which correlate the reference position of a virtual 3D data set gathered by medical imaging with the reference position of the patient.
In their latest studies, Li’s group used Microsoft’s HoloLens 2 headset. In a February keynote at SPIE’s Advanced Lithography Conference, Gregory McIntyre, who managed the development of HoloLens displays, said these devices contain a light engine along with projection optics that together create the virtual image. Similar components are found in VR headsets.
A virtual tour
A see-through AR headset such as the HoloLens 2 adds a combiner that superimposes virtual image data onto a transparent screen through which users can view the real world. A pass-through AR device, in contrast, combines computer-generated virtual images with the live feed from a digital camera and shows both sets of images on the same display, such as on the screen of a tablet, smartphone, or something similar to a VR headset equipped with a forward-facing camera. The pass-through approach, proponents claim, offers a wider field of view and costs less. See-through AR, however, has major backers.
Interventional radiologist Brian Park of Oregon Health & Science University uses a HoloLens 2 AR headset to help guide his movements during a microwave ablation procedure (top). The headset superimposes 3D medical image data about the patient onto the patient’s physical body, allowing doctors to operate more quickly and precisely to improve patient outcomes (bottom). Courtesy of Peter Li and Brian Park/OHSU.
Both AR and VR systems rely on light engines, which generate images for display. High resolution and brightness are important requirements for these components, eMagin CEO Andrew Sculley said. The direct patterning technique that the company uses to make its active-matrix organic LED (AMOLED) microdisplays produces red, green, and blue light-emitting layers on the substrate to eliminate the need for color filters. The result is a brighter display suitable for high-resolution applications, he said.
When it comes to light engine resolution, he said that a large field of view (for example, over 100°), with at least 40 pixels per degree, can produce a truly immersive experience for viewers. Such a light engine would require more than 4000 pixels on a side, while still being small and light enough for an AR/VR headset.
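Sculley’s resolution arithmetic can be checked directly: the pixel count a display needs along one axis is simply the field of view multiplied by the angular pixel density. A minimal sketch using the figures cited above (the function name is illustrative):

```python
def required_pixels(fov_deg: float, pixels_per_degree: float) -> int:
    """Pixels needed along one display axis to cover a given field of
    view at a given angular resolution."""
    return int(fov_deg * pixels_per_degree)

# Figures cited in the article: a 100-degree field of view
# at 40 pixels per degree.
print(required_pixels(100, 40))  # 4000 pixels per side
```

At 4000 × 4000 pixels, such a microdisplay would exceed even today’s 4K-class panels while fitting in a headset, which is why resolution remains a central light engine challenge.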
Light engines require a certain baseline intensity to account for losses along the optical path and, in the case of see-through AR systems, to overcome ambient light. But the ability to modulate and control brightness is just as relevant. Rapid head or eye movements can result in a smeared AR/VR image for viewers. “The only way to avoid that is that I have the light from the display on for only a portion of the frames,” Sculley said.
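The persistence tradeoff Sculley describes can be quantified: if the display is lit for only a fraction of each frame, its instantaneous output must rise by the inverse of that duty cycle to keep perceived brightness constant. A rough sketch, assuming a 10% duty cycle (a common figure for low-persistence headset displays) and an illustrative luminance target:

```python
def instantaneous_luminance(perceived_nits: float, duty_cycle: float) -> float:
    """Luminance the light engine must emit while lit so that the
    time-averaged (perceived) luminance meets the target."""
    if not 0 < duty_cycle <= 1:
        raise ValueError("duty cycle must be in (0, 1]")
    return perceived_nits / duty_cycle

# A 200-nit perceived image at a 10% duty cycle requires the display
# to emit 2000 nits during the brief window it is actually on.
print(instantaneous_luminance(200, 0.10))
```

This tenfold brightness penalty is one reason low-persistence operation pushes designers toward intrinsically brighter emitters.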
The company eMagin eliminates the color filters normally used in microdisplays (left) by employing a direct patterning technique to produce active-matrix OLED microdisplay structures (right), resulting in brighter displays that are suitable for high-resolution AR/VR applications. EML: emitting layer. Courtesy of eMagin.
For many headsets, this typically means that light is produced only 10% of the time. The light engine increases output to ensure that images persist for the viewer. However, not all AR headsets use microdisplays for light engines. Startup Tilt Five, for instance, developed a headset that sends the light from red, green, and blue LEDs through a liquid-crystal-on-silicon array and a pinhole.
“Light is emitted from the glasses and goes down to this retroreflector on the game board, which has this film on it,” said Jeri Ellsworth, CEO of Tilt Five. “[The] light does a 180° rotation and comes directly back to the users.”
A 2K × 2K microdisplay image (top) illustrates the immersive experience possible in VR headsets (bottom). Courtesy of eMagin.
This optical path design makes it possible for a physical object, such as a model castle on a game board, to appear in focus along with a projected digital image, such as a virtual dragon attacking the castle. Tilt Five’s headset is still under development, but it is expected to become commercially available this year. Its field of view spans 110°, which Ellsworth said has drawn interest from game companies and other users. One commercial application under discussion would use the headset as a collaboration tool for product design.
Lasers are another candidate light source for AR/VR headsets. But Ellsworth said costs for these components — particularly green lasers — need to come down before they are suitable for use in a consumer-priced item.
Augmentations
See-through AR implementations need a combiner to superimpose virtual images in a way that makes the images appear acceptably genuine to the viewer. Simple prisms were initially used for this component, though optical waveguides have recently gained more favor.
Dispelix, a spinoff of the Finnish research institute VTT, fabricates diffraction-based waveguides using e-beam lithography to achieve 20-nm linewidth features in a single layer rather than the three layers that had until recently represented the state of the art, Dispelix CEO Antti Sunnari said.
For AR/VR designers, this advancement means lighter and smaller headsets because only one waveguide is needed instead of three. It also improves image quality, he said, because multiple waveguides can lead to ghost images arising from light leaking from one to another.
Dispelix’s current waveguides have found their way into the latest AR devices, such as smart glasses and head-up displays. But the company has plans to make smaller waveguides that have a larger field of view and the ability to handle brighter light engines. “We have a new product coming with laser functionality,” Sunnari said. His company is a founding member of the Laser Scanning for Augmented Reality (LaSAR) Alliance.
Eye tracking is another photonics technology that is advancing AR/VR. By tracking exactly where a user’s gaze is directed, AR/VR systems can selectively increase resolution for the corresponding section of a virtual display by using a technique known as foveated rendering. Eye tracking can also allow a person to issue commands by simply gazing at a virtual control for a fixed period of time.
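Dwell-based gaze commands of this kind reduce to a simple timer: trigger when the gaze point stays inside a control’s bounds for a fixed interval, and reset the moment it leaves. A minimal sketch, with all names and thresholds illustrative rather than drawn from any particular headset’s API:

```python
from dataclasses import dataclass

@dataclass
class DwellButton:
    """Virtual control that fires after continuous gaze within its bounds."""
    x0: float
    y0: float
    x1: float
    y1: float
    dwell_s: float = 0.8      # required gaze duration (illustrative)
    _elapsed: float = 0.0     # accumulated in-bounds gaze time

    def update(self, gaze_x: float, gaze_y: float, dt: float) -> bool:
        """Feed one gaze sample; returns True once the dwell completes."""
        if self.x0 <= gaze_x <= self.x1 and self.y0 <= gaze_y <= self.y1:
            self._elapsed += dt
            if self._elapsed >= self.dwell_s:
                self._elapsed = 0.0
                return True
        else:
            self._elapsed = 0.0  # gaze left the control; restart the timer
        return False

# At a 500 Hz sample rate (dt = 2 ms), 0.9 s of continuous in-bounds
# samples comfortably exceeds the 0.8 s dwell and fires the command.
btn = DwellButton(0, 0, 1, 1)
fired = any(btn.update(0.5, 0.5, 0.002) for _ in range(450))
print(fired)  # True
```

The same accumulator pattern is what makes tracker robustness matter: a single dropped or misplaced gaze sample resets the timer and forces the user to start over.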
Eye trackers from Argus Science employ a typical architecture. The devices rely on two near-IR cameras with corresponding light sources mounted on the interior of a headset to illuminate and monitor a user’s eyes. While frame rates above 60 Hz are sufficient to capture rapid eye movements, known as saccades, Argus Science products are designed to be used in research and other applications, and thus they are able to capture 180 fps. Finding sensors small enough, fast enough, and sensitive enough in the near-IR was a challenge, said Robert Wilson, general manager for Argus. Higher sensitivity would enhance eye-tracking technology by enabling the capture of even smaller eye movements, or microsaccades.
Eye tracking could also be made applicable to a wider variety of eye types and end users. Some users, for example, have droopy eyelids that cover the cornea, or other conditions that make eye tracking difficult.
“If something works on 90% or 95% of the subjects 90% or 95% of the time, that’s great for doing good research,” Wilson said. “But if something is going to be used by a consumer, that needs to be 99.9%.”
Waveguides can be fabricated with form factors similar to eyeglass lenses to combine virtual and real images in see-through augmented reality applications. Courtesy of Dispelix.
Eye-tracking technology is a critical component for improving the experience of AR/VR headsets while reducing power consumption. IR-sensitive cameras directed at the user’s eyes (upper corners) track the direction of gaze and correlate it with the scene captured by forward-looking cameras as the cameras image real-world activities. Courtesy of Argus Science.
Advancements may be on the way that allow eye tracking technology to adapt to a broader consumer base, said Andrew Duchowski, a professor of visual computing at Clemson University. One possible solution could be to remove cameras from the mix, along with their associated processing.
“That could potentially revolutionize the whole ballgame,” Duchowski said.
Startup AdHawk Microsystems, for instance, developed one such system that scans a low-power light beam across the eye thousands of times per second and captures reflected light with a simple sensor. The company said that, in addition to offering a lower-power solution, its system tracks eye movements with better than 1° accuracy and samples at 500 Hz — significantly higher than the HoloLens 2 headset, which operates in the tens of hertz range.
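The practical difference between these sampling rates is easy to see: a saccade lasts only tens of milliseconds, so the number of tracker samples landing inside one is the saccade duration times the rate. A quick check, assuming a 30 ms saccade (a typical textbook value, not a figure from the article):

```python
def samples_per_saccade(rate_hz: float, saccade_ms: float) -> float:
    """Average number of eye-tracker samples captured during one saccade."""
    return rate_hz * saccade_ms / 1000.0

# A ~30 ms saccade sampled at 500 Hz vs. at 30 Hz ("tens of hertz"):
print(samples_per_saccade(500, 30))  # 15.0 samples
print(samples_per_saccade(30, 30))   # 0.9 -- the saccade may be missed entirely
```

Fifteen samples are enough to resolve a saccade’s trajectory; at tens of hertz, the movement can fall entirely between frames.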
These and other photonics advancements are steadily evolving AR/VR technology to be smaller, lighter, and less power-hungry, as well as more immersive and intuitive for end users. Each new generation fuels the demands of end users a little more, and the day may yet come when smart glasses will give smartphones real competition.