
Microscopy Method Monitors Deep-Brain Metabolic Changes

A microscopy system developed by researchers at MIT addresses the challenge of using all-optical imaging techniques to visualize metabolic changes and neuronal activity deep within the brain. By combining acoustic detection with multiphoton excitation in a single microscope, the researchers achieved exceptional imaging depth while preserving sharp, cellular-scale detail.

In the system, an ultrasound microphone built into the microscope detects the acoustic waves generated when the excitation light is absorbed in the tissue, and the recorded sound data is converted into high-resolution images. The system also uses a near-infrared femtosecond (NIR-fs) laser for excitation, ensuring that the wavelength is long enough to penetrate deeply into tissue.
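
The image-formation step described above can be illustrated with a minimal sketch. In photoacoustic microscopy generally, each laser pulse produces a time-resolved acoustic trace, and mapping acoustic arrival time to depth through the speed of sound in tissue gives a depth profile at each scan position; stacking the profiles across a raster scan builds an image. The array shapes, sampling rate, speed-of-sound value, and Hilbert-transform envelope detection below are illustrative assumptions, not details of the MIT pipeline.

```python
import numpy as np
from scipy.signal import hilbert

# Illustrative sketch (not the MIT pipeline): turn raster-scanned
# photoacoustic traces into a depth-resolved image.

SPEED_OF_SOUND = 1500.0   # m/s, typical soft-tissue value (assumed)
SAMPLE_RATE = 250e6       # Hz, assumed digitizer rate for the ultrasound detector

def envelope(trace: np.ndarray) -> np.ndarray:
    """Envelope-detect one time-resolved acoustic trace (an A-line).

    The envelope magnitude reflects local optical absorption; arrival
    time maps to depth via the speed of sound.
    """
    return np.abs(hilbert(trace))

def reconstruct(a_lines: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Stack envelope-detected A-lines from a raster scan into a 2D image.

    a_lines has shape (n_scan_positions, n_time_samples).
    Returns (image, depth_axis_in_meters).
    """
    image = np.vstack([envelope(t) for t in a_lines])
    depths = np.arange(a_lines.shape[1]) / SAMPLE_RATE * SPEED_OF_SOUND
    return image, depths

# Demo with synthetic data: 128 scan positions, 2048 time samples each,
# plus a mock absorber whose echo arrives at sample 500 (~3 mm deep).
rng = np.random.default_rng(0)
traces = rng.normal(0.0, 0.01, (128, 2048))
traces[:, 500] += 1.0
image, depth_axis = reconstruct(traces)
print(image.shape, f"peak signal at ~{depth_axis[image.mean(axis=0).argmax()] * 1e3:.1f} mm")
```

Scanning the beam in two lateral dimensions would extend the same idea to volumetric imaging.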

“We merged all these techniques — three-photon, label-free, photoacoustic detection,” researcher Tatsuya Osaki said. “We integrated all these cutting-edge techniques into one process to establish this ‘Multiphoton-In and Acoustic-Out’ platform.”
MIT’s new multiphoton, photoacoustic, label-free microscopy system. Courtesy of MIT Picower Institute/Tatsuya Osaki.

The label-free, multiphoton, photoacoustic microscope (LF-MP-PAM) could provide a way to monitor the metabolic changes in brain cells, both in vitro and in vivo, and measure brain activity without the need for external labels. Further, it could be used to investigate the normal and pathological mechanisms underlying neurodegenerative diseases and psychiatric disorders.

Using LF-MP-PAM with a 1300-nm fs laser, the researchers identified and imaged endogenous NAD(P)H at the single-cell level in living cultured cells, mouse brain slices, and human cerebral organoids. Nicotinamide adenine dinucleotide (NAD), a critical molecule in the cellular metabolic pathway, exists in an oxidized form, NAD(P)+, and a reduced form, NAD(P)H. Real-time detection of NAD(P)H in the brain could serve as a biomarker for assessing the activity of neurons during normal functioning and disease progression.

The researchers confirmed the NADH photoacoustic signal against standard NADH imaging, validating the signal’s energy, frequency, and acoustic transit time. They introduced NADH into cells and observed an increase in photoacoustic signals, which was confirmed with conventional fluorescence-based NAD(P)H sensors. Although NAD(P)H emitted only a weak fluorescent signal, the absorbed energy produced a localized thermal expansion of about 10 μm within the cell, generating sound waves that traveled through the tissue far more easily than the emitted fluorescence.
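
The light-to-sound conversion described above is commonly summarized, in its conventional single-photon form, by the photoacoustic relation p0 = Γ · μa · F, where Γ is the Grüneisen parameter, μa the optical absorption coefficient, and F the local laser fluence. The short sketch below plugs in generic soft-tissue values purely to illustrate the order of magnitude of the pressure transient; the numbers are assumptions, not measurements from this study, and the multiphoton excitation used here has a nonlinear dependence on fluence.

```python
# Back-of-the-envelope sketch of the standard (single-photon) photoacoustic
# generation relation p0 = Gamma * mu_a * F. All values are generic
# illustrations, not figures from the MIT study.

GRUNEISEN = 0.2   # dimensionless Grueneisen parameter, typical soft tissue (assumed)
MU_A = 50.0       # 1/m, assumed effective optical absorption coefficient
FLUENCE = 100.0   # J/m^2 (10 mJ/cm^2), assumed local laser fluence

def initial_pressure(gamma: float, mu_a: float, fluence: float) -> float:
    """Initial photoacoustic pressure rise, in pascals."""
    return gamma * mu_a * fluence

p0 = initial_pressure(GRUNEISEN, MU_A, FLUENCE)
print(f"initial pressure rise ~ {p0:.0f} Pa ({p0 / 1e3:.1f} kPa)")
# 0.2 * 50 * 100 = 1,000 Pa: even modest absorption launches an ultrasonic
# transient, and ultrasound scatters far less in tissue than the
# corresponding fluorescence photons.
```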

Using photoacoustic detection, the researchers accessed depths of 700 μm in the mouse brain slices and 1100 μm in cerebral organoids from human stem cells. According to the team, LF-MP-PAM penetrated human organoid tissue at more than 5x the depth of other microscopy technologies. The researchers also achieved a strong photoacoustic NAD(P)H signal in the brain slices, at a depth 6x greater than the reported optical imaging depth for NAD(P)H.


“The major advance here is to enable us to image deeper at single-cell resolution,” professor Mriganka Sur said.

The researchers also developed an imaging subsystem and integrated it into the LF-MP-PAM platform to demonstrate a photoacoustic-generated spatial map of NAD(P)H in organoid and brain slice cells. They demonstrated simultaneous third-harmonic-generation imaging from three-photon stimulation, producing detailed renderings of cellular structures.
In this edited version of a figure from the research, NAD(P)H molecules within cells in a cerebral organoid are detected photoacoustically (in blue, left) and optically (black and white, right). Image depth is 0.2 mm. Courtesy of MIT.

The team continues to refine the system’s signal processing capabilities and is looking ahead to potential applications for LF-MP-PAM in neuroscience and clinical settings. The researchers have already established that NAD(P)H imaging can inform wound care.

Levels of the NAD(P)H molecule in the brain are known to vary in patients who experience seizures and various neurological disorders like Alzheimer’s disease and Rett syndrome, making NAD(P)H a potentially valuable biomarker for these conditions. Because the new system is label-free, it could be used for deep tissue imaging during surgeries.
 
The next step for the researchers will be to demonstrate LF-MP-PAM in a live animal. To move beyond in vitro and ex vivo tissue imaging, the team will first need to reposition the microphone so that it sits on top of the sample, like the light source.
 
The research was published in Light: Science & Applications (www.doi.org/10.1038/s41377-025-01895-x).

Published: August 2025
Glossary
photoacoustic imaging
Abbreviated PAI. A hybrid imaging modality based on the acoustic detection of optical absorption from endogenous chromophores or exogenous contrast agents. Light is absorbed by the chromophores and converted into transient heating, and the resulting thermoelastic expansion emits ultrasonic waves. In tissue, ultrasound scatters less than light; PAI therefore generates high-resolution images in the diffusive and optical ballistic regimes compared to purely...
