Consumer AR/VR Headsets Focus on Nontraditional Optics
FAROOQ AHMED, CONTRIBUTING EDITOR
The promise of commercial augmented and virtual reality (AR/VR) technologies seems always to be just around the corner. Nearly three decades have passed since the Sega Corporation announced its Sega VR gaming headset, which was one of the first of its kind. But the Japanese company could not produce it in volume for home use.
North Inc.’s Focals glasses allow wearers to discreetly view notifications from their smartphones. Courtesy of North Inc.
Despite enthusiasm for such devices from nearly all sectors — consumers, researchers, educators and academics, gaming companies, and numerous industries, including health care — such headsets and related eyewear have yet to see routine, commonplace use. Military applications have far surpassed consumer-facing ones.
Several factors have confounded the development and widespread adoption of the technology. On the engineering side, many of the necessary optoelectronic components are just beginning to mature and miniaturize. These components must then come together in a format that will comfortably fit users who have differing facial structures, interpupillary distances, and tolerances for bulky headwear.
In addition, AR/VR headsets can cause eyestrain, headaches, dizziness, and nausea. These unintended effects helped doom the Sega VR.
Conflicting views
Over the past 30 years, technological development has generally followed Moore’s law; processors have gotten faster and components have increased in capability, while shrinking in size. But smart glasses and headsets for AR/VR have only made the occasional and fleeting appearance for consumers (Figure 1).
Figure 1. The University of Arizona’s Hong Hua developed a light-field AR display that uses freeform optics. Courtesy of Hong Hua/College of Optical Sciences, University of Arizona.
“One of the problems,” said Hong Hua, professor at the University of Arizona’s (UA’s) College of Optical Sciences, “is that lenses don’t obey Moore’s law.” Her 2017 paper on focus cues in head-mounted displays remains a highly referenced overview of one of the most vexing issues that AR/VR engineers face — the vergence-accommodation conflict (VAC).¹
Typically, when we look at something, our eyes converge on the object, and eye muscles change the shape of our lenses so that we can focus on it. However, in virtual environments such as the ones created by conventional head-mounted displays, our eyes must focus (accommodate) on a flat, 2D screen imaged by an eyepiece at a fixed distance from our face, even as they converge on simulated objects that appear to lie at different depths. The mismatch between where our eyes converge and where they must focus can lead to eyestrain, fatigue, and dizziness.
Up to 40 percent of users develop some form of VR-associated sickness, even after wearing the displays for relatively short periods of time.
“The decoupling of accommodation and convergence cues in stereoscopic displays leads to discomfort,” Hua said. She added that solutions to the VAC problem involve providing users with appropriate focus cues for each eye at different focal depths.
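To put numbers on that decoupling, the conflict is commonly quantified by expressing both the vergence distance implied by stereo disparity and the fixed accommodation distance of the display’s virtual image as reciprocals of distance in meters (diopters) and taking their difference. The minimal Python sketch below illustrates this bookkeeping; the 0.5-m object distance and 2-m display focal distance are illustrative values, not figures from the article.

```python
def vergence_accommodation_mismatch(object_distance_m: float,
                                    display_focal_distance_m: float) -> float:
    """Return the vergence-accommodation mismatch in diopters.

    The eyes converge on the simulated object (vergence distance),
    while the display's virtual image sits at a fixed optical distance
    (accommodation distance). Both are expressed as reciprocals of
    distance in meters, i.e., in diopters.
    """
    vergence_d = 1.0 / object_distance_m               # where the eyes point
    accommodation_d = 1.0 / display_focal_distance_m   # where they must focus
    return abs(vergence_d - accommodation_d)

# Illustrative case: an eyepiece that places the screen at an apparent 2 m,
# showing an object that stereo disparity puts at 0.5 m.
mismatch = vergence_accommodation_mismatch(0.5, 2.0)
print(f"Mismatch: {mismatch:.2f} D")  # 1.50 D of conflict for this example
```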
At least a half dozen methods exist for overcoming VAC. They can be broken down into two general categories of multiplexing approaches: those that use a static or spatial method and those that use a dynamic or time-division one.
Multiplexing provides a way to split an image so that it appears at the correct focal plane for each eye. A large challenge remains in engineering VAC solutions for AR/VR systems that offer a wide field of view (the arc of what’s observable from a stationary point) and a large eye box (the area in which one’s eyes can scan and have the objects remain in focus).
“The next-generation spatial-computing platforms that everyone wants need to be really compact,” Hua said. “But many solutions to VAC require adding optics or other hardware.”
Stefan Alexander, vice president of advanced research and development at Ontario, Canada-based North Inc., agreed.
“Most VAC solutions cause more problems in other human factor areas than they solve,” he said, “for example, [in] weight, size, brightness, transparency, image quality, and so on.”
North has made headway in reaching consumers in the AR wearables sector through its Focals glasses, which connect to a wearer’s smartphone and display a variety of information, including text messages, turn-by-turn directions, and appointments and alarms. Released last year to consumers, the glasses were developed in-house by the company.
A polarizing approach
One new approach to solving VAC won first place in the Optical Design Challenge at the 2019 SPIE Photonics West conference in San Francisco in February. Guanjun Tan, a graduate student in the lab of professor Shin-Tson Wu at the University of Central Florida (UCF) College of Optics and Photonics, showcased a method that uses bifocal, Pancharatnam-Berry phase lenses and a spatial polarization modulator to perform the multiplexing.² The lenses produce multiple focal planes simultaneously, while the spatial polarization modulator presents each eye with an image at the correct focal plane (Figure 2).
Figure 2. Guanjun Tan of the University of Central Florida designed a polarization-multiplexed multiplane display that overcomes the vergence-accommodation conflict (VAC). PBL: Pancharatnam-Berry lens. Courtesy of Guanjun Tan/CREOL, The College of Optics and Photonics, University of Central Florida.
Tan said he was inspired to tackle this entrenched AR/VR issue because he found current commercial headsets to be lacking. “Conventional displays typically only have one focal plane. But with our binocular vision, we need about six focal planes active at a time. Our method gets closer to this requirement in a small footprint, while avoiding some of the problems of spatial and dynamic multiplexing.”
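The counting behind such multiplane schemes is straightforward: a bifocal Pancharatnam-Berry lens toggles between two optical powers depending on the handedness of the incident circular polarization, and thin lens elements in contact combine by adding their powers, so a stack of N binary-state elements can in principle address up to 2^N focal planes. The Python sketch below only illustrates that counting argument, using made-up lens powers and an idealized on/off abstraction for each element; it is not the UCF design.

```python
from itertools import product

def reachable_focal_planes(base_power_d: float,
                           switchable_powers_d: list[float]) -> set[float]:
    """Enumerate the total optical powers reachable by a stack of binary
    switchable lens elements placed against a fixed base lens.

    Thin lenses in contact combine by adding optical powers (diopters).
    Each switchable element is modeled as contributing either 0 D (off)
    or its full power (on), so n elements give up to 2**n total powers.
    """
    planes = set()
    for states in product((0.0, 1.0), repeat=len(switchable_powers_d)):
        total = base_power_d + sum(s * p for s, p in zip(states, switchable_powers_d))
        planes.add(round(total, 3))
    return planes

# Hypothetical powers: a 2 D base lens plus switchable 0.5 D and 1.0 D
# elements yields four addressable focal planes.
print(sorted(reachable_focal_planes(2.0, [0.5, 1.0])))  # [2.0, 2.5, 3.0, 3.5]
```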
The UA’s Hua agreed that combining the bifocal lenses with a polarization modulator was a good solution for producing multiple focal planes. She said her lab used polarization to solve another common problem that affects AR/VR displays — image brightness in well-lit environments — by designing a polarized head-mounted display.³
The freeform future
A solution to help decrease the size of AR/VR headsets, according to Jannick Rolland, is to depart from solely relying on traditional lenses and to embrace freeform optics instead. Rolland is a professor of optical engineering at the Institute of Optics at the University of Rochester and chief technical officer of LighTopTech, a noninvasive 3D-microscopy imaging company that she helped start in 2013.
“We need to use more complex lens shapes to design the sunglass-like form factor with [the] large fields of view that people want,” she said.
Freeform optics are essentially nonsymmetrical lenses that can be shaped, contoured, and stacked to produce a desired functionality. Their surfaces can now be machined to nanometer precision, and they have been used in nonimaging applications such as car headlights since the early 1990s — when precision was achieved only to the micron level (Figure 3). Rolland has been exploring freeform optics for more than a decade.
Figure 3. In Matthew Davies’ lab at the University of North Carolina at Charlotte, a mechanical engineering doctoral student holds a prototype freeform optic designed for a compact telescope. The design was developed by the University of Rochester’s Aaron Bauer in collaboration with Jannick Rolland. Courtesy of Matthew Davies/University of North Carolina at Charlotte.
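Mathematically, a freeform surface is commonly described as a standard conic base shape plus nonrotationally symmetric polynomial departures in x and y. The sketch below evaluates one such sag (surface-height) profile; the curvature, conic constant, and polynomial coefficients are arbitrary illustrative numbers, not values from any design discussed here.

```python
import math

def freeform_sag(x: float, y: float, c: float, k: float,
                 xy_coeffs: dict[tuple[int, int], float]) -> float:
    """Sag of a freeform optic: a conic base plus XY polynomial terms.

    c          -- base curvature (1 / radius of curvature, in 1/mm)
    k          -- conic constant
    xy_coeffs  -- {(i, j): coefficient} terms of the form coeff * x**i * y**j
    """
    r2 = x * x + y * y
    conic = c * r2 / (1.0 + math.sqrt(1.0 - (1.0 + k) * c * c * r2))
    poly = sum(a * x**i * y**j for (i, j), a in xy_coeffs.items())
    return conic + poly

# Illustrative evaluation: a weak spherical base with small astigmatism-like
# (x^2, y^2) and coma-like (y^3) departures; all numbers are arbitrary, in mm.
coeffs = {(2, 0): 1e-4, (0, 2): -5e-5, (0, 3): 2e-6}
print(f"{freeform_sag(3.0, -2.0, c=0.01, k=0.0, xy_coeffs=coeffs):.5f} mm")
```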
Because of their atypical shapes, freeform optics produce field-dependent aberrations with patterns that differ significantly from those of traditional lenses. Early attempts at fabrication were hampered by the lack of a formal measurement, or metrology, system to ensure that the fabricated shapes complied with the designed ones. “Many labs were taking a brute-force approach,” Rolland said.
In May 2018, her group published a methodology to design freeform surfaces within an optical system that accounts for aberrations and emphasizes manufacturability.⁴ She said, “We now have a very good handle on the basic theory and method. We have also made a lot of progress in fabrication — with many kinds of materials.”
Volume manufacturing of freeform optics for the consumer market could deliver lightweight head-mounted displays with appropriate corrections for aberrations as well as VAC. Six years ago, to help guide development of the components and solve obstacles in their design, Rolland helped spearhead the Center for Freeform Optics, which is a collaboration between industry and academia and is sponsored by the National Science Foundation. The universities of Rochester and North Carolina at Charlotte lead the center on the academic side. Corporate and government members with AR/VR interests include Google, Facebook Reality Labs, the Air Force Research Lab, Ball Aerospace, Collins Aerospace, Thales, Jabil, Synopsys, and ZEISS.
Rolland said the complexities of AR/VR headsets, as well as the promise of freeform optics, were major factors that led to the establishment of the center. “It’s a partnership focused on precompetitive research [that] allows companies, even close competitors, to participate side by side. A consortium is a great way to bring our community together with a common vision to accelerate our knowledge and impact our optics industry in ways we never imagined 10 years ago.”
The UA’s Hua has also been exploring freeform optics. In collaboration with Wu, Tan’s adviser at UCF, she developed a digitally switchable, multifocal, freeform lens.⁵ The headset uses an electronically programmable optical shutter array to switch the light path, which varies the foci. Hua believes the device may be useful for addressing VAC.
“I cannot necessarily say that freeform optics has to be tied to a solution for vergence-accommodation conflict problems, but it just happens to be, in a system that we developed,” she said.
North face
North’s Focals also employ freeform optics. In addition to the lenses, the glasses use a microelectromechanical laser projector with a holographic combiner that creates multiple exit pupils to display information discreetly and in multicolor to wearers (see opening image).
“Freeform surfaces created a superior end product for us,” North’s Alexander said. “However, the extra engineering effort was significant, and we expect reducing that effort to continue to be an industry focus.”
North was founded in 2012 “with a vision to change the way people interacted with technology and the world around them,” Alexander said. After developing the Myo armband, which allowed wearers to control their computers by moving their arms, North pivoted to focus on head-mounted displays for spatial computing. At its Brooklyn, N.Y., and Toronto showrooms, North takes a 3D scan of wearers’ heads to custom fit the glasses. “We started out with our top priorities,” he said. “People should look good wearing Focals. [The glasses] should be comfortable and they should be useful.”
Alexander believes the greatest potential for head-mounted displays lies with consumers. “The decisions and trade-offs we made around Focals [were made] with them in mind.”
www.linkedin.com/in/farooqtheahmed
Acknowledgments
The author would like to thank Hong Hua, the University of Arizona; Guanjun Tan, the University of Central Florida; Stefan Alexander, North Inc.; and Jannick Rolland, the University of Rochester.
References
1. H. Hua (2017). Enabling focus cues in head-mounted displays. Proc IEEE, Vol. 105, Issue 5, pp. 805-824.
2. G. Tan et al. (2018). Polarization-multiplexed multiplane display. Opt Lett, Vol. 43, Issue 22, pp. 5651-5654.
3. R. Zhang and H. Hua (2008). Design of a polarized head-mounted projection display using ferroelectric liquid-crystal-on-silicon microdisplays. Appl Opt, Vol. 47, Issue 15, pp. 2888-2896.
4. A. Bauer et al. (2018). Starting geometry creation and design method for freeform optics. Nat Commun, Vol. 9, p. 1756.
5. X. Wang et al. (2018). Digitally switchable multi-focal lens using freeform optics. Opt Express, Vol. 26, pp. 11007-11017.