
Video Tech Enables Imaging Through Scattering Media

HOUSTON, July 4, 2023 — Researchers from Rice University and the University of Maryland have created full-motion video technology that could enable cameras to peer through fog, smoke, driving rain, murky water, skin, bone, and other media that scatter light and obscure objects from view.

According to Ashok Veeraraghavan, professor of electrical and computer engineering at Rice University, imaging through scattering media is the “holy grail problem” in optical imaging. “Scattering is what makes light — which has a shorter wavelength, and therefore gives much better spatial resolution — unusable in many, many scenarios. If you can undo the effects of scattering, then imaging just goes so much further.”

Veeraraghavan’s lab collaborated with the research group of Maryland co-corresponding author Christopher Metzler to create a technology they named NeuWS, which is an acronym for “neural wavefront shaping,” the technology’s core technique.

“If you ask people who are working on autonomous driving vehicles about the biggest challenges they face, they’ll say, ‘Bad weather. We can’t do good imaging in bad weather,’” Veeraraghavan said. “They are saying ‘bad weather,’ but what they mean, in technical terms, is light scattering. If you ask biologists about the biggest challenges in microscopy, they’ll say, ‘We can’t image deep tissue in vivo.’ They’re saying, ‘deep tissue’ and ‘in vivo,’ but what they actually mean is that skin and other layers of tissue they want to see through are scattering light. If you ask underwater photographers about their biggest challenge, they’ll say, ‘I can only image things that are close to me.’ What they actually mean is light scatters in water, and therefore doesn’t go deep enough for them to focus on things that are far away.

“In all of these circumstances, and others, the real technical problem is scattering,” Veeraraghavan said.
In experiments, camera technology called NeuWS, which was invented by collaborators at Rice University and the University of Maryland, was able to correct for the interference of light-scattering media between the camera and the object being imaged. The top row shows a reference image of a butterfly stamp (left), the stamp imaged by a regular camera through a piece of onion skin that was approximately 80 µm thick (center), and a NeuWS image that corrected for light scattering by the onion skin (right). The center row shows reference (left), uncorrected (center), and corrected (right) images of a sample of dog esophagus tissue with a 0.5° light diffuser as the scattering medium. The bottom row shows corresponding images of a positive resolution target with a glass slide covered in nail polish as the scattering medium. Close-ups of inset images from each row are shown for comparison at left. Courtesy of Veeraraghavan Lab/Rice University.

NeuWS could potentially be used to overcome scattering in those scenarios and others.

Conceptually, NeuWS is based on the principle that light waves are complex mathematical quantities with two key properties that can be computed for any given location. The first, magnitude, is the amount of energy the wave carries at that location; the second, phase, is the wave’s state of oscillation there. Veeraraghavan and Metzler said measuring phase is critical for overcoming scattering, but it is impractical to measure directly because of the high frequency of optical light.
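The idea can be sketched in a few lines of NumPy: a light field at one point is modeled as a complex number whose magnitude and phase are the two properties described above, and a camera sensor records only the intensity, which is why phase must be recovered indirectly. (This is a generic illustration of the physics, not code from the NeuWS system.)

```python
import numpy as np

# Model the light field at one location as a complex number:
# magnitude encodes the energy carried, phase encodes the state of oscillation.
field = 0.8 * np.exp(1j * np.pi / 3)  # amplitude 0.8, phase of pi/3 radians

magnitude = np.abs(field)   # -> 0.8
phase = np.angle(field)     # -> pi/3, about 1.047 rad

# An ordinary camera sensor records only intensity |field|^2,
# so the phase information is lost and must be recovered indirectly.
intensity = np.abs(field) ** 2
print(magnitude, phase, intensity)
```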

Instead, they measure incoming light as “wavefronts,” single measurements that contain both phase and intensity information, and use backend processing to quickly decipher phase information from several hundred wavefront measurements per second.

“The technical challenge is finding a way to rapidly measure phase information,” said Metzler, an assistant professor of computer science at Maryland and a Rice alum. Metzler was at Rice University during the development of an earlier iteration of wavefront-processing technology called WISH that Veeraraghavan and colleagues published in 2020.


“WISH tackled the same problem, but it worked under the assumption that everything was static and nice,” Veeraraghavan said. “In the real world, of course, things change all of the time.”

With NeuWS, he said, the idea is to not only undo the effects of scattering, but to undo them fast enough so the scattering media itself doesn’t change during the measurement.

“Instead of measuring the state of oscillation itself, you measure its correlation with known wavefronts,” Veeraraghavan said. “You take a known wavefront, you interfere that with the unknown wavefront, and you measure the interference pattern produced by the two. That is the correlation between those two wavefronts.”
Rice University Ph.D. student Haiyun Guo and Professor Ashok Veeraraghavan in the Rice Computational Imaging Laboratory. Guo, Veeraraghavan, and collaborators at the University of Maryland have created full-motion video camera technology that corrects for light scattering and has the potential to allow cameras to film through fog, smoke, driving rain, murky water, skin, bone, and other light-penetrable obstructions. Courtesy of Brandon Martin/Rice University.

Metzler used the analogy of looking at the North Star at night through a haze of clouds. “If I know what the North Star is supposed to look like, and I can tell it is blurred in a particular way, then that tells me how everything else will be blurred.”

Veeraraghavan said, “It’s not a comparison, it’s a correlation, and if you measure at least three such correlations, you can uniquely recover the unknown wavefront.”
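A classic illustration of why three correlations suffice is three-step phase-shifting interferometry: interfere the unknown field with a reference wave shifted by three known phases, record only the intensities, and solve for the unknown phase in closed form. The toy sketch below demonstrates that principle at a single pixel; it is a textbook technique chosen for illustration, not the authors’ exact algorithm.

```python
import numpy as np

def recover_phase(u, thetas=(0.0, 2 * np.pi / 3, 4 * np.pi / 3)):
    """Recover the phase of an unknown complex field u from three
    intensity-only interference measurements with a unit reference wave.
    Each measurement is I_k = |u + exp(i*theta_k)|^2
                            = |u|^2 + 1 + 2|u|cos(phi - theta_k)."""
    I1, I2, I3 = (np.abs(u + np.exp(1j * t)) ** 2 for t in thetas)
    # For shifts of 0, 120, and 240 degrees:
    #   I2 - I3        = 2*sqrt(3)*|u|*sin(phi)
    #   2*I1 - I2 - I3 = 6*|u|*cos(phi)  (up to a common positive scale)
    return np.arctan2(np.sqrt(3) * (I2 - I3), 2 * I1 - I2 - I3)

u = 0.7 * np.exp(1j * 1.2)  # unknown wavefront at one pixel, phase 1.2 rad
print(recover_phase(u))     # ~1.2
```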

State-of-the-art spatial light modulators can make several hundred such measurements per second, and Veeraraghavan, Metzler, and colleagues showed they could use a modulator and their computational method to capture video of moving objects that were obscured from view by intervening scattering media.

“This is the first step, the proof of principle that this technology can correct for light scattering in real time,” said Rice’s Haiyun Guo, one of the study’s lead authors and a Ph.D. student in Veeraraghavan’s research group.

In one set of experiments, for example, a microscope slide containing a printed image of an owl or a turtle was spun on a spindle and filmed by an overhead camera. Light-scattering media were placed between the camera and target slide, and the researchers measured NeuWS’ ability to correct for light scattering. Examples of scattering media included onion skin, slides coated with nail polish, slices of chicken breast tissue, and light-diffusing films. For each of these, the experiments showed that NeuWS could correct for light scattering and produce clear video of the spinning figures.

“We developed algorithms that allow us to continuously estimate both the scattering and the scene,” Metzler said. “That’s what allows us to do this, and we do it with mathematical machinery called neural representation that allows it to be both efficient and fast.”

NeuWS rapidly modulates light from incoming wavefronts to create several slightly altered phase measurements. The altered phases are then fed directly into a 16,000-parameter neural network that quickly computes the necessary correlations to recover the wavefront’s original phase information.
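The measurement side of that pipeline can be sketched as a simple forward model: a known phase modulation is applied to light from an unknown scene distorted by an unknown scattering screen, and the camera records intensity only. The names, sizes, and far-field (Fourier) propagation model below are assumptions made for illustration; the actual NeuWS optics and its 16,000-parameter network are described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32  # tiny image for illustration

# Hypothetical unknowns: a complex-valued scene and a scattering phase screen.
scene = rng.random((n, n)) * np.exp(1j * rng.uniform(-np.pi, np.pi, (n, n)))
screen = np.exp(1j * rng.uniform(-np.pi, np.pi, (n, n)))

def measure(slm_phase):
    """Intensity image for one known phase-modulation pattern,
    using a far-field (Fourier) propagation model as a stand-in."""
    field = scene * screen * np.exp(1j * slm_phase)
    return np.abs(np.fft.fft2(field)) ** 2  # camera sees intensity only

# Several modulated measurements; a NeuWS-style reconstruction would fit a
# neural representation of (scene, screen) that explains all of them at once.
measurements = [measure(rng.uniform(-np.pi, np.pi, (n, n))) for _ in range(8)]
print(len(measurements), measurements[0].shape)
```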

“The neural networks allow it to be faster by allowing us to design algorithms that require fewer measurements,” Veeraraghavan said.

Metzler said, “That’s actually the biggest selling point. Fewer measurements, basically, means we need much less capture time. It’s what allows us to capture video rather than still frames.”

The work was supported by the Air Force Office of Scientific Research, the National Science Foundation, and the National Institutes of Health, and partial funding for open access was provided by the University of Maryland Libraries’ Open Access Publishing Fund.

The research was published in Science Advances (www.doi.org/10.1126/sciadv.adg4671).

Published: July 2023
