For optics, part of the future is up in the air. Airborne imaging depends on optics to get light where it needs to go. Now, larger detectors in traditional imaging bands, along with others that operate in relatively new spectral ranges, are forcing optical components to evolve to keep up and deliver the clearest possible image. At the same time, airborne directed-energy weapons, primarily laser-based, are putting pressure on optical component performance.

Mounted in structural assemblies, optics help ensure airborne sensors produce the clearest possible image. Courtesy of Thales Group.

Improving what components can deliver is critical, because lagging optics can neutralize advances made elsewhere in the system. Longwave-infrared detectors, for instance, capture light at 6- to 15-µm wavelengths. Pixel sizes have fallen over the past few years from more than 25 µm down to about 10 µm, with an accompanying increase in resolution. But this improvement means little if the optics don't also get better.

"The reality is a poor lens, even on the world's best detector, will give a poor image," said Peter Williams, business development manager for precision optics at Gooch & Housego in Ilminster, England. The company researches, designs, engineers and manufactures advanced photonic systems, components and instrumentation.

The optical component industry is turning to innovations in materials and manufacturing to keep up with the latest detector and laser system developments. The goal is to have solutions that perform well as measured against SWaP-C, an acronym used by the aerospace and defense industries that stands for size, weight, power and cost. Advances in these four areas benefit aerospace and defense applications, and sometimes it is even possible to see improvements in several of them simultaneously.

Unmanned aerial vehicles use increasingly sophisticated optics for surveillance of large areas, intelligence gathering, targeting and other applications. Courtesy of Thales Group.
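One way to see why shrinking pixels put pressure on lens quality is to compare pixel pitch against the diffraction-limited spot size. The sketch below is illustrative only; the 10-µm working wavelength and the f-numbers are assumptions, not figures from any specific system. At longwave-IR wavelengths, the Airy disk already exceeds a 10-µm pixel even for fast optics, so the lens must be very well corrected for the detector's resolution to pay off:

```python
# Compare the diffraction-limited Airy disk diameter against a
# 10-um LWIR pixel. Wavelength and f-numbers are assumed,
# illustrative values.

def airy_disk_diameter_um(wavelength_um, f_number):
    """First-zero Airy disk diameter: 2.44 * lambda * f/#."""
    return 2.44 * wavelength_um * f_number

wavelength = 10.0  # um, roughly mid-band for an LWIR sensor
for f_number in (1.0, 1.5, 2.0):
    spot = airy_disk_diameter_um(wavelength, f_number)
    print(f"f/{f_number}: Airy disk ~ {spot:.1f} um vs a 10 um pixel")
```

Even at a fast f/1, the diffraction-limited spot spans more than two 10-µm pixels, which is one reason detector gains mean little without matching optical quality.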
Infrared optics

As is the case for other parts of the spectrum, companies can improve performance by making use of aspheric surfaces. These deviate, sometimes significantly, from the ideal curves described by spherical equations. As a result, they produce imaging performance not possible with conventional, spherical optics. They are, however, hard to manufacture and even harder to measure, largely because past techniques were all designed to make and measure spherical surfaces.

Infrared optics can have an easier time with aspherics, though, because of the wavelengths at which they operate. Surfaces can be machined using diamond point turning techniques so that they are diffractive, enabling different wavelengths to be directed along different paths within the optical system. This means that color correction and additional adjustments can be combined with other functions, reducing the size of the solution by merging multiple elements and components into one.

"You can design that into a structure of one element," Williams said. "So, effectively, you could, for example, reduce that from five elements to three. So if you do that, you shorten it (the optical system), you've lightened it and you take some of the cost away."

He added that the cost of manufacturing the combined diffractive element will be higher than that of the multiple simple pieces it replaces. However, the savings that arise from eliminating multiple components generally make up for this increase.

According to Williams, the optical assembly is often the last part of the instrument to be designed. By then, the other parts of the imaging system (the detector, electronics, processor and aiming system) have largely been settled. Thus, optical designers have to fit their part of the device into whatever space is left over. This may require one or more folds in the path, each of which can roughly halve the physical length of the system.
In all of this, it is important that the optics deliver light in such a way that the detector is fully illuminated and completely filled. That is getting harder as sensors reach larger and larger pixel counts. Some years ago, for instance, the U.S. Army unveiled a 1.8-gigapixel video system capable of conducting surveillance over a 10-square-mile area when flown at a height of 20,000 feet. Dubbed ARGUS-IS and composed of 368 five-megapixel smartphone camera sensors, the system could resolve details as small as six inches at a distance of several miles. As this example shows, enormous pixel counts are now possible.

Spectrum-wide detectors

In addition to increasing pixel counts, detectors can operate across new swaths of the spectrum. Visible detectors operating in the 400- to 700-nm range, and mid- and longwave-IR sensors in the 3- to 6-µm and 6- to 15-µm ranges, respectively, have long been around. Now also gaining in popularity are shortwave-IR detectors, which operate from 0.9 to 1.7 µm. Like visible sensors, these need ambient light to work. Shortwave-IR also allows systems to peer through smoke and other obscurants at high resolution, if the optics exist to do so.

"Detectors are getting better by the year and that makes great demands on the optics," Williams said. "If you're in excess of 2000 pixels [on a side], that inevitably means a bigger detector, but it also means your optics have to perform exceptionally well."

Multiplying optical channels such as shortwave-IR is a trend that makes it more challenging to hit SWaP-C targets, according to François-Hugues Gauthier, chief technology officer for optronics activities at the Paris-based Thales Group. The company's optronics division is a multinational manufacturer of components for aerospace and defense applications.
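The ARGUS-IS figures quoted earlier can be sanity-checked with quick arithmetic: 368 five-megapixel sensors give roughly 1.84 gigapixels, and spreading that over a 10-square-mile footprint yields a ground sample on the order of a few inches per pixel, consistent with the six-inch detail claim. A minimal check (the even spread of pixels over the footprint is a simplifying assumption):

```python
# Back-of-envelope check of the ARGUS-IS numbers: 368 sensors at
# 5 MP each, imaging a 10-square-mile area. Assumes pixels are
# spread evenly over the footprint, which is a simplification.
import math

sensors = 368
pixels_per_sensor = 5_000_000
total_pixels = sensors * pixels_per_sensor      # ~1.84 gigapixels
print(f"total: {total_pixels / 1e9:.2f} GP")

area_sq_ft = 10 * 5280**2                       # 10 square miles in ft^2
ground_sample_ft = math.sqrt(area_sq_ft / total_pixels)
print(f"ground sample ~ {ground_sample_ft * 12:.1f} inches per pixel")
```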
One way in which size and weight are being reduced is through the use of zoom lens technology; specifically, approaches that allow continuous magnification changes over a particular range.

Optics-based systems have to be protected in flight, which may be done through a clear enclosure or via a retractable system. Courtesy of Thales Group.

"Zooms are also often replacing multi-focal lenses in several products, giving the capability to the user to adapt the field of view to the mission," Gauthier said.

This can be useful because field-of-view requirements can change, sometimes within a span of minutes. When flying a helicopter to a location, for example, a pilot may want a very large field of regard, the total area a sensor can capture when movement of its line of sight is taken into account. But when observing or targeting, that field of regard needs to be narrower and possibly may need to encompass different spectral bands.

Unfortunately for optics, a single solution is unlikely to offer the best possible performance under all circumstances. A fisheye lens in the visible will not use the same components as a system offering 10× zoom in the infrared, Gauthier said. Thus, designers have to make choices and accept trade-offs, all while trying to reduce SWaP-C.

"The optimization of the system consists in finding the best compromise between different parameters such as the size of the entrance lens of an optical system, the focal length, the spectral band, the detector and also a lot of other elements such as mechanics, thermal effects, line of sight orientation and stabilization, bore sighting, and so on," Gauthier said.

Imaging challenges

Imaging has its own challenges, including sometimes having to deal with too few photons. Directed-energy weapons, on the other hand, always have to contend with many, many photons.
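The field-of-view flexibility that zooms provide follows directly from lens geometry: for a fixed detector, the field of view shrinks as focal length grows. A minimal sketch of the relationship, where the 16-mm detector width and the 25- to 250-mm (10×) zoom range are assumed, hypothetical values:

```python
# Field of view vs focal length for a simple lens over a detector.
# The 16-mm detector width and the 10x zoom range are assumptions
# chosen only to illustrate the trade-off.
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Full horizontal field of view: 2 * atan(w / (2 * f))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

sensor = 16.0  # mm, assumed detector width
for f in (25.0, 250.0):  # hypothetical 10x zoom range
    print(f"f = {f:>5.0f} mm -> FOV = {horizontal_fov_deg(sensor, f):.1f} deg")
```

Zooming from 25 to 250 mm takes this hypothetical system from a wide search view of about 35° down to a narrow targeting view under 4°, which is the mission-driven adaptability Gauthier describes.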
Instead of gathering distant photons onto a detector, directed-energy weapons attempt to get photons from a source to a target. Because of the photon flux involved, this has to be done in such a way that most of the energy does not end up in the optics. That is primarily handled through coatings, thin films deposited on surfaces, often one layer at a time. Gauthier noted that coatings can enhance reflection and transmission while suppressing absorption. Some spectral combinations have to be avoided, he said, because the achievable reflection, transmission and absorption characteristics are unacceptable.

"Absorption in coatings cannot be totally suppressed but can be drastically reduced using the right choices of material and deposition techniques," Gauthier said.

Due to detector improvements in resolution, optical components have had to increase performance, which can mean, among other things, that surfaces must be smoother. Courtesy of Gooch & Housego.

The need for such coatings, and the day of high-power aerial lasers, may not be that far off. A demonstration system flying a chemical oxygen iodine laser in a modified Boeing 747 successfully destroyed two missiles early in 2010. The U.S. Air Force has said that it is on track to demonstrate a working defensive laser weapon for fighters by 2020, and other nations are working on similar weapons, according to U.S. Air Force officials.

When it comes to coatings, water is the enemy and greater density is desirable, according to Iwan Dodd, director of business development for aerospace and defense within Gooch & Housego. The two are related, and density is important for several reasons. A denser coating is both more robust and better able to reflect or transmit photons, two attributes that help the optics deal with the flux created by a directed-energy weapon. Greater density arises when water is kept out of the deposition process. While progress in coatings has been made, more is needed, Dodd noted.
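To put the concern about absorption in perspective, even parts-per-million absorptance turns into substantial heating at weapons-class power levels, since absorbed power scales linearly with incident power. A rough sketch, where the 100-kW beam power and the absorptance values are assumed, illustrative figures rather than specifications of any real system:

```python
# Heating of a single optic from residual coating absorption.
# The 100-kW beam and the ppm-level absorptances are assumed,
# illustrative values.

def absorbed_watts(incident_w, absorptance):
    """Power deposited in the coating: incident power times absorptance."""
    return incident_w * absorptance

beam_power = 100_000.0  # W, a notional high-energy laser
for ppm in (1000, 100, 10):
    heat = absorbed_watts(beam_power, ppm * 1e-6)
    print(f"{ppm:>5} ppm absorption -> {heat:.0f} W deposited in one optic")
```

Cutting absorption from 1000 ppm to 10 ppm in this sketch reduces the heat load on each optic from 100 W to 1 W, which is why dense, low-absorption coatings matter so much for directed-energy systems.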
That is a challenge, given that wavelengths from the ultraviolet below 400 nm to the infrared above 10 µm may be involved. No single coating material or deposition technique is optimal across this entire span for all applications. The industry is working to meet these needs.

"Coatings do play a big role in the performance of the lens," Dodd said. "And that's where we have a huge investment."