Conventional microscopy methods can image organelles in live cells, but they struggle to capture the organelles' interactome at the system level because of the organelles' small size, fast dynamics, and diversity. Traditional microscopy can also damage the sample through phototoxicity and photobleaching. To visualize the substructures and dynamic interactions within living cells, researchers developed an imaging technology that draws on AI and deep learning to image intracellular activity at superresolution.

The new imaging technique, developed by researchers at Peking University, Ningbo Eastern Institute of Technology, and the University of Technology Sydney, achieves fast segmentation and multiplexed imaging of organelles and their interactions within live cells. The ability to visualize multiple cellular processes at the same time could provide insight into the origins of diseases such as cancer, neurodegenerative disorders, and metabolic conditions.

According to professor Dayong Jin, traditional microscopy methods struggle to show multiple structures within a cell simultaneously because they are restricted in the number of colors they can use.

The multiplexed imaging technique enables organelle segmentation in different biological samples by using universal lipid staining and deep learning. Courtesy of the University of Technology Sydney.

To enable multiplexed imaging, the researchers dispensed with the conventional one-to-one labeling strategy used in fluorescence microscopy and developed a "one-to-many" strategy for labeling organelles. They stained multiple intracellular compartments with a single lipid dye that labeled all the membrane-associated organelles with nearly 100% efficiency. Because the dye's emission spectrum shifted with the lipid polarity of the membranes, the researchers could use two imaging channels to obtain high-resolution ratiometric measurements, which allowed them to discriminate between organelles with similar shapes and sizes.

By using a single dye label, the researchers overcame the constraint of requiring multiple colors and accelerated the imaging process. The resulting high-resolution images accurately captured the differences between organelles. The researchers used a spinning disk confocal microscope with an extended resolution of about 143 nm for high spatiotemporal acquisition of images.

"Current tools such as fluorescence microscopy have limitations in resolution that make it difficult to see the tiny structures within cells or track detailed cellular processes," Jin said.

Using a deep convolutional neural network (DCNN), the researchers segmented up to 15 subcellular structures with a single laser excitation and two detection channels. They showed that a simple cell-staining protocol, followed by rapid acquisition of spatial and spectral imaging data, could enable the DCNN to predict 15 intracellular structures with high accuracy, speed, reproducibility, stability, and throughput. They resolved the 3D anatomical structure of live cells at different mitotic phases and tracked the fast, dynamic interactions among six intracellular compartments.

The multiplexed imaging technique is also highly adaptable: Using transfer learning, the researchers predicted structures in 2D and 3D datasets from different microscopes, different cell types, and complex systems of living tissue.
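The ratiometric readout described above is simple to express in code. The sketch below, in Python with NumPy, shows one minimal way a pixelwise two-channel ratio could be computed; the function name, channel layout, and synthetic data are illustrative assumptions, not the authors' published pipeline.

```python
import numpy as np

def ratiometric_map(ch_short, ch_long, eps=1e-6):
    """Compute a pixelwise ratio image from two detection channels.

    ch_short, ch_long: 2D arrays of fluorescence intensity recorded in
    the shorter- and longer-wavelength channels of the same field.
    The ratio reports the local spectral shift of the lipid dye, which
    tracks membrane lipid polarity and helps distinguish organelles
    of similar shape and size.
    """
    ch_short = ch_short.astype(np.float64)
    ch_long = ch_long.astype(np.float64)
    return ch_short / (ch_long + eps)  # eps avoids division by zero

# Synthetic data standing in for the two detection channels.
rng = np.random.default_rng(0)
short_ch = rng.poisson(200, size=(512, 512))
long_ch = rng.poisson(150, size=(512, 512))
ratio = ratiometric_map(short_ch, long_ch)
print(ratio.mean())
```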
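To make the DCNN step concrete, the following is a minimal PyTorch stand-in for a network that maps a dual-channel image to a per-pixel label over 15 organelle classes plus background. The layer choices and names are assumptions for illustration; the published architecture is not reproduced here.

```python
import torch
import torch.nn as nn

class OrganelleSegNet(nn.Module):
    """Toy segmentation network: dual-channel input, 16 output classes
    (15 organelle types plus background)."""
    def __init__(self, in_channels=2, num_classes=16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # 1x1 convolution produces per-pixel class logits.
        self.classifier = nn.Conv2d(64, num_classes, kernel_size=1)

    def forward(self, x):
        return self.classifier(self.features(x))

model = OrganelleSegNet()
dual_channel = torch.randn(1, 2, 256, 256)  # one dual-channel frame
logits = model(dual_channel)                # shape: (1, 16, 256, 256)
labels = logits.argmax(dim=1)               # per-pixel organelle prediction
print(labels.shape)
```

Adapting such a network to a new microscope or cell type would typically mean fine-tuning its later layers on a small labeled set from the new domain, which is consistent with the transfer learning the team reports.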
The new imaging technique will enable scientists to investigate the 3D structure of live cells during different stages of cell division and to observe the rapid interactions among intracellular compartments. Jin said the team is currently working with several medical research institutions on the exploration of virus-cell interactions and cell defense mechanisms, and on the imaging of cardiomyocytes for the study of heart disease.

"It's like taking an airplane over a city at night and watching all the live interactions," Jin said. "This cutting-edge technology will open new doors in the quest to understand the intricate world within our cells."

The research was published in Nature Communications (https://doi.org/10.1038/s41467-025-57877-5).