Optics Transforming AR, Wearables and Beyond
VALERIE C. COFFEY, SCIENCE WRITER, stellaredit@gmail.com

Change in optics usually comes gradually, but today the field is on the cusp of rapid transformation. As in the past, optics continues its coupling with nonoptical technologies: fabless semiconductor manufacturing, microelectronics, nanotechnology, materials processing and software development.
Optics still plays a central role in a complex whole. What’s new is the attention and investment from corporate giants including Apple, Amazon, Google, IBM, Microsoft and Samsung. All plan to be involved, in numerous ways, in forthcoming optics-based technologies that promise to be revolutionary. The way information is displayed, processed, illuminated and manipulated may look very different in a few short years (Figure 1).
Figure 1. Rollable, lightweight OLED displays that deploy from a pen-sized container are the vision for many display makers and may be just a few years away. Courtesy of Universal Display Corp.
Today’s smartphones incorporate OLED displays; lasers, sensors and filters for facial recognition and depth sensing; ambient brightness sensors; zoom lenses; and other camera optics. The coming year will see major strides toward layering augmented reality (AR) features over the optics in these devices, meshing the real world with virtual functionality. Unlike virtual reality (VR), in which the user is completely immersed in a fictitious world, AR layers virtual content onto the user’s view of the real world.
Early applications of augmented reality, such as Pokémon Go and Snapchat selfie filters, are evidence of its potential as a major phenomenon. Whoever wins the race to make more advanced AR/VR commonplace stands to win big.
That’s why companies such as Microsoft, Apple and Google are investing millions in research and introducing — in just the past year — several promising products and major initiatives in AR/VR.
In June 2017, Apple revealed a software development kit (SDK) for its AR platform, ARKit, which enables developers to design AR-based apps for iPhone users. The framework uses a TrueDepth camera to accurately track the user’s face in real time, including position, topology and expression. ARKit incorporates visual inertial odometry, fusing camera imagery with the phone’s motion-sensor data to track its position relative to its surroundings, while optical sensors estimate the brightness of a scene and detect horizontal planes such as tables and floors. The result is the ability to overlay digital objects and information onto the user’s environment (Figure 2).
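To illustrate the idea behind visual inertial odometry, here is a minimal one-dimensional sketch in Python. It is not Apple’s implementation; the sample rate, blend gain and noise figures are illustrative assumptions. Inertial data arrives fast but drifts, while occasional camera-derived position fixes pull the estimate back toward the truth.

```python
import numpy as np

# Toy 1D visual-inertial fusion: dead-reckon from accelerometer samples,
# then periodically blend in a drift-free camera-derived position fix.
DT = 0.01      # 100 Hz inertial updates (assumed)
ALPHA = 0.2    # blend weight toward each visual fix (assumed)

position, velocity = 0.0, 0.0

def imu_step(accel):
    """Integrate one accelerometer sample; error accumulates over time."""
    global position, velocity
    velocity += accel * DT
    position += velocity * DT

def visual_fix(cam_pos):
    """Nudge the inertial estimate toward the camera's position estimate."""
    global position
    position += ALPHA * (cam_pos - position)

# The phone is actually stationary at 0.0; the accelerometer has a bias.
# (A real VIO filter also estimates velocity and sensor bias; this toy
# corrects position only.)
rng = np.random.default_rng(0)
for i in range(200):
    imu_step(0.05 + rng.normal(0.0, 0.02))   # constant bias + noise
    if i % 10 == 9:                          # camera fix at 10 Hz
        visual_fix(0.0)
print(f"fused position error after 2 s: {abs(position):.3f} m")
```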
Figure 2. The ARCore platform, now in development, will enable Android phone users to overlay virtual content on scenes of the real world using existing optics, cameras and sensors. Courtesy of Google.
In response to Apple’s ARKit, Google introduced ARCore several months later: a platform for developers to build AR apps on Android smartphones. ARCore identifies key points, such as horizontal surfaces or objects, using the motion-tracking sensors, environmental-understanding and brightness-estimation technologies already present in the Google Pixel and Samsung Galaxy S8 phones. Both the Apple and Google AR platforms will enable users to select or interact with objects, define an anchor, place new objects and layer virtual information on real-world scenes.
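An “anchor” in either platform is, in essence, a fixed pose in the tracked world frame to which virtual content is attached, so content stays put as the platform refines its world estimate. A minimal sketch of the concept in Python follows; the Anchor class and function names are illustrative, not the actual ARKit or ARCore API.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Anchor:
    """Hypothetical anchor: a rigid pose fixed in the tracked world frame."""
    world_from_anchor: np.ndarray  # 4x4 homogeneous transform

def place_object(anchor: Anchor, offset_in_anchor: np.ndarray) -> np.ndarray:
    """World-space position of a point given in the anchor's local frame
    (homogeneous coordinates)."""
    return anchor.world_from_anchor @ offset_in_anchor

# Anchor on a detected tabletop 0.7 m in front of the camera origin.
pose = np.eye(4)
pose[2, 3] = -0.7
table = Anchor(world_from_anchor=pose)
print(place_object(table, np.array([0.0, 0.1, 0.0, 1.0])))  # 10 cm above anchor
```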
On the higher end, Microsoft’s HoloLens brings AR to wearable computers (Figure 3). HoloLens is the world’s first self-contained, wearable holographic computer, enabling users to engage with digital content and interact with 3D holograms in the world around them. Specialized components, including multiple optical, motion and audio sensors, high-resolution optics and a custom holographic processing unit, enable the device to blend what’s on the display with what users see and hear beyond it. Its mixed reality apps combine a view of the real world with holograms for applications including wireless holographic entertainment, biomedical visualization, education, design, planning and virtual exploration of travel destinations from Machu Picchu to Mars.
Figure 3. Microsoft’s HoloLens, available to software developers, will mesh augmented reality with a wearable computer, incorporating cameras with high-resolution lenses and high-accuracy sensors. Courtesy of Microsoft.
The HoloLens device is wireless, incorporating transparent lenses, waveguides, advanced cameras, sensors, spatial sound and a specialized holographic processing unit (HPU) that together let users walk around holograms or have holograms travel with them. The head-worn computer has sensors that allow the wearer to move a cursor by turning their head, and software that supports simple gestures to open apps, select items, and drag and drop holograms into view. The device also supports voice-activated navigation, selection, opening, and command and control of apps. HoloLens is currently available to developers, but it’s costly at $3000.
The OLED revolution
Advances in wearable AR/VR require lightweight headsets with low power consumption and minimal lag. Toward that end, many future AR and VR systems, not to mention next-generation smartphones and tablets, hinge on display technology. In particular, organic LED (OLED) technology is likely to play a major role: it aims to replace heavy, breakable glass with lightweight, responsive, rugged plastic thin film.
In VR headsets, in which the user wears nontransparent goggles, one of the biggest technological hurdles has been the inherent delay between when users move their heads and when the display updates to reflect that movement. This “sensory mismatch” can cause motion sickness, making for a vomit-inducing experience, according to Jeff Hebb, vice president of global marketing at Kateeva, a Newark, Calif.-based maker of inkjet manufacturing equipment for OLED displays.
“OLEDs offer response times 100× faster than LCDs,” said Hebb, “reducing the sensory mismatch problem, while weighing less and generating less heat for a more comfortable user experience.
“Also, OLED VR/AR displays must have long lifetimes, so encapsulation to protect the emitting layers from oxygen and moisture is absolutely necessary,” said Hebb. Kateeva is developing an RGB pixel printing process that will efficiently deposit the OLED emissive layers and protective encapsulation using equipment that is scalable to high volumes, thus reducing manufacturing costs.
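To put the 100× response-time figure in perspective, a quick back-of-the-envelope frame-budget calculation in Python (the panel response times below are illustrative ballpark values, not measurements of any specific display):

```python
# Rough motion-to-photon budget for a 90 Hz VR headset. All figures are
# illustrative assumptions, not vendor specifications.
REFRESH_HZ = 90
frame_budget_ms = 1000 / REFRESH_HZ          # ~11.1 ms per frame

lcd_response_ms = 5.0     # typical LCD pixel transition (assumed)
oled_response_ms = 0.05   # ~100x faster, per the quoted comparison

for name, resp in [("LCD", lcd_response_ms), ("OLED", oled_response_ms)]:
    print(f"{name}: pixel response consumes {resp / frame_budget_ms:.1%} "
          f"of the {frame_budget_ms:.1f} ms frame budget")
```

With illustrative numbers like these, LCD pixel transitions alone eat nearly half the frame budget, while OLED transitions are negligible, which is why OLEDs ease the sensory-mismatch problem.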
Foldable displays
As in AR/VR, makers of smartphones and tablets are also betting on OLED displays. Samsung has incorporated these displays in its smartphones since 2008. In 2017, Apple, Google and LG launched their first OLED smartphones with edge-curved screens. Like the Samsung Galaxy S8, the OLED displays of the Apple iPhone X, Google Pixel 2 and LG V30 all offer improved color accuracy, image contrast and viewing angles, as well as lower power consumption, compared with LCDs. The displays are still protected with glass, for now. But manufacturers are racing neck and neck to be the first to eliminate the glass and perfect a foldable plastic OLED display, along with all the innards of the phone.
The holy grail is a tablet/smartphone hybrid featuring a large screen with the ability to fold into a neat pocket-sized stack. According to Mike Hack, vice president of business development at Ewing, N.J.-based Universal Display Corp. (UDC), the industry is getting very close to the finish line.
“The backplane, the transistors that control the display, the substrate, the polarizer, the touch panel — all these bendable parts have been developed,” he said. “Now we have to master how to yield them in mass production.”
Figure 4. Phosphorescent OLED (PHOLED) emitter materials in the form of powders are layered on thin-film displays to enable power-efficient, lightweight, emissive layers with various color, spectrum and linewidth properties. Courtesy of Universal Display Corp.
Foldable, lightweight, low-heat displays depend on advances in new materials, such as phosphorescent emitter compounds with efficient power conversion and long operational lifetime. Different layers of such materials are required in the emissive layers of OLED devices (Figure 4).
“Materials scientists at UDC are constantly working on these developments,” said Hack. Industry insiders concur that foldable displays will come to fruition in the next couple of years.
Figure 5. OLED technology is burgeoning as an effective and low-cost flexible display technology, power efficient and lightweight enough for military and other wearable applications. Courtesy of Universal Display Corp.
In addition to VR, smartphones and tablets, OLEDs are also emerging in large-format displays, lighting, automotive displays, entertainment, signage, military/aerospace applications and solar technology (Figure 5).
The quantum wave
Another major trend in optical technology poised to have a seismic impact is quantum computing (QC). The analog of the classical bit in QC is the quantum bit, or qubit, the memory element that could enable quantum computers to solve certain types of problems, such as modeling complex molecular interactions, analyzing large data sets or optimizing logistics routes, up to a million times faster than current computers.
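For readers new to the notation, the textbook definition of a qubit (standard quantum information theory, not specific to any company’s hardware) is a superposition of two basis states:

$$
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1.
$$

A register of n qubits is described by 2^n complex amplitudes, which is both the source of the potential speedup and the reason even ~50 qubits overwhelm classical simulation.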
The implications are revolutionary in a plethora of applications, including pharmaceuticals, genetics, oil and gas exploration, artificial intelligence, financial management and untold others. In an October 2017 report, the financial services firm Morgan Stanley estimated that the market for high-end quantum computers will double in size to $10 billion in the next decade. Google, IBM, Intel, Microsoft and Nokia Bell Labs, as well as many universities and governments, have doubled down on their investments in developing various QC components and architectures.
In fact, the pace of development of every part of quantum computers has been so fast over the past few years that companies are now testing working components of hybrid quantum computers. The architectures of these working QCs are as varied as the companies (and sometimes vary within a single company), combining mature semiconductor technology with optical technology such as single-photon sources, photonic qubits, optical interconnects, waveguides and single-photon detectors. Scientists at the University of California, Santa Barbara, designed an architecture, adopted by Google, based on superconducting qubits in a closed loop. Nokia Bell Labs is exploring an architecture based on topological quantum computing, which leverages the exotic exchange properties of non-Abelian anyons; its approach involves using a plane of supercooled, superconducting materials to form composite fermion pairs. Like Nokia, Microsoft is pursuing quantum computing via topological effects, except it is using crystalline nanowire networks to allow “braiding” of Majorana fermions.
Currently, the main QC goal is scaling up the number of qubits, which are inherently unstable due to quantum decoherence; the more qubits we interconnect, the harder it is to control them.¹ Various approaches to qubits include Bose-Einstein condensates, trapped ions and control of photons encoded via frequency, angular momentum or polarization states. IBM is betting on superconducting transmon qubits in Josephson junctions to create what is essentially an artificial atom with two distinct states of oscillation.
In March 2017, IBM launched IBM Q, an initiative to build a commercially available QC system with open-source accessibility via the cloud. According to Anthony Annunziata, associate director of the IBM Q Network in Yorktown Heights, N.Y., the company’s approach uses well-established semiconductor techniques to create a universal gate-based quantum processor that can run algorithms and solve otherwise intractable problems in business and science.
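IBM’s gate-based approach is programmable through its open-source Qiskit toolkit. As a flavor of what such a processor runs, here is a minimal two-qubit entangling circuit; this sketch assumes a recent Qiskit installation with the qiskit-aer simulator package, and on the IBM Q cloud the same circuit would be submitted to real hardware instead.

```python
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

# Bell-state circuit: a Hadamard puts qubit 0 into superposition,
# then a CNOT entangles it with qubit 1.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

# Run on a local simulator and tally measurement outcomes.
counts = AerSimulator().run(qc, shots=1024).result().get_counts()
print(counts)  # expect roughly equal '00' and '11', and no '01'/'10'
```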
In May 2017, IBM Q announced an open quantum processor with 16 qubits available for beta access and a 17-qubit commercial prototype (Figure 6). In September, a seven-qubit quantum processor from IBM Q successfully solved the molecular structure problem of beryllium hydride (BeH₂), the largest molecule simulated on a quantum computer to date.²
Figure 6. The core technology in this novel quantum 16-qubit chip by IBM Q is superconducting transmon qubits with Josephson junctions. The 16-qubit system is available to developers on an open-source platform. Courtesy of IBM Q.
“Quantum computers promise to greatly assist in discovering new drug compounds and new materials, while saving time, money and resources in solving difficult optimization problems,” said Annunziata. “We’re at a very exciting stage where the field is progressing rapidly. As the number of qubits grows, optical technology will link quantum processors together and connect them all to the cloud.”
The world record for the number of interconnected qubits in a classically simulated quantum system currently stands at 45, well on its way to the “quantum supremacy” threshold of 49 qubits, the point at which QCs could best today’s supercomputers. Many experts believe quantum supremacy could come to fruition in the next few years.
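The 49-qubit threshold is not arbitrary: simulating a quantum processor classically means storing one complex amplitude per basis state, and that memory doubles with every added qubit. A quick calculation (assuming double-precision complex numbers) shows why roughly 49 qubits is where the largest classical machines give out:

```python
# Storing the full state vector of n qubits needs 2**n complex amplitudes.
BYTES_PER_AMPLITUDE = 16  # one complex number at double precision

for n in (45, 49):
    size_pb = (2**n * BYTES_PER_AMPLITUDE) / 1e15
    print(f"{n} qubits -> {size_pb:,.1f} PB of state-vector memory")
# 45 qubits -> ~0.6 PB; 49 qubits -> ~9.0 PB
```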
However, a single quantum computer is only one part of a longer-term and more complex goal: a quantum network that can transmit photons over appreciable distances. To that end, Peter Lodahl, professor and head of the Quantum Photonics group at the Niels Bohr Institute at the University of Copenhagen, Denmark, along with international colleagues, is working on photonic networks based on quantum dots embedded in photonic nanostructures.³ Which quantum architecture will scale most successfully to a larger network? It’s too early to predict a winner, and some think it’s unnecessary to pick one.
“Quantum technology can be applied to many things,” said Lodahl. “We don’t have to just chase a universal quantum computer. Rather, the effort worldwide is toward development of quantum hardware for a range of applications, including quantum simulators, quantum key distribution, quantum repeaters, quantum sensors and ultimately, quantum networks. Development of hybrid approaches will play a key role.”
References
1. S. DeAngelis (Oct. 2014). Closing in on quantum computing. Wired online.
2. A. Kandala et al. (Sept. 2017). Hardware-efficient variational quantum eigensolver for small molecules and quantum magnets. Nature, Vol. 549, p. 242.
3. P. Lodahl et al. (Jan. 2017). Chiral quantum optics. Nature, Vol. 541, p. 473.