Cascaded Neural Networks Help Virtually Re-Stain Tissue Samples
Using a cascaded deep neural network structure, a UCLA research group led by professor Aydogan Ozcan developed a computational approach for chemical-free re-staining of tissue specimens. The AI-powered virtual stain transfer technique produced high-quality virtual images of different stains from existing, histochemically stained slides. It is a repeatable process that saves time and cost, reduces waste, and preserves the biopsied tissue so that it can be used for additional testing.
To diagnose disease, pathologists visually inspect tissue specimens taken from the patient. They may use different types of stains to add contrast and highlight various histological features of the tissue. These histochemical staining procedures are usually irreversible, making it difficult to obtain multiple stains on the same thin tissue section.
The Ozcan-led team built a virtual stain transfer framework using a cascade of two deep neural networks and demonstrated the framework's ability to digitally transform hematoxylin and eosin (H&E)-stained tissue images into other types of histological stains. During training, the cascaded deep neural network structure first learned to use virtual staining to transform autofluorescence microscopy images into H&E images, and then learned to execute a stain transfer from H&E to the domain of another stain in a cascaded manner.
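As a rough illustration of this arrangement, the sketch below chains two small image-to-image networks in the cascaded order described above. It is a minimal, hypothetical PyTorch-style example: the tiny ConvBlock generators, tensor sizes, and variable names are assumptions for illustration and stand in for the much larger networks used in the study.

```python
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    """Tiny convolutional generator standing in for a full image-to-image network."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(64, out_ch, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

# Stage 1: virtual staining -- label-free autofluorescence input -> virtual H&E (RGB).
stain_af_to_he = ConvBlock(in_ch=1, out_ch=3)
# Stage 2: stain transfer -- H&E (RGB) -> virtual PAS (RGB), fed by stage 1's output.
transfer_he_to_pas = ConvBlock(in_ch=3, out_ch=3)

autofluorescence = torch.randn(1, 1, 256, 256)  # placeholder autofluorescence image
virtual_he = stain_af_to_he(autofluorescence)   # output of the first network
virtual_pas = transfer_he_to_pas(virtual_he)    # cascaded output of the second network
```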
By implementing a cascaded structure during the training phase, the researchers enabled the model to directly exploit histochemically stained image data on both H&E and the other stain of interest — in this case, the periodic acid-Schiff (PAS) stain. The cascaded training strategy helped to mitigate the challenge of paired data acquisition when using histochemically stained slides and improved the image quality and color accuracy of the virtual stain transfer from H&E to a different stain.
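One way to picture that training strategy is to supervise each stage of the cascade with a ground-truth image of its own stain, so the H&E-to-PAS network can learn without the same physical section ever being chemically re-stained. The minimal sketch below assumes registered ground-truth H&E and PAS images and uses a simple pixel loss; it is an illustrative stand-in under those assumptions, not the group's published training procedure.

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Hypothetical stand-in generators; the networks in the study are far larger.
stain_af_to_he = nn.Sequential(nn.Conv2d(1, 3, 3, padding=1))      # autofluorescence -> virtual H&E
transfer_he_to_pas = nn.Sequential(nn.Conv2d(3, 3, 3, padding=1))  # H&E -> virtual PAS

# Hypothetical training data: a label-free autofluorescence image plus registered,
# histochemically stained H&E and PAS ground-truth images (assumed available here).
autofluorescence = torch.rand(1, 1, 256, 256)
true_he = torch.rand(1, 3, 256, 256)
true_pas = torch.rand(1, 3, 256, 256)

pixel_loss = nn.L1Loss()  # simple stand-in for the study's full training objective
optimizer = optim.Adam(
    list(stain_af_to_he.parameters()) + list(transfer_he_to_pas.parameters()),
    lr=1e-4,
)

virtual_he = stain_af_to_he(autofluorescence)   # stage 1: virtual staining
virtual_pas = transfer_he_to_pas(virtual_he)    # stage 2: cascaded stain transfer

# Each stage is penalized against histochemically stained image data of its own
# stain, so the H&E-to-PAS transfer is learned without chemically re-staining
# the same section.
loss = pixel_loss(virtual_he, true_he) + pixel_loss(virtual_pas, true_pas)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```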
A tissue slide can typically be stained only once, with a single type of stain. Washing away the existing stain and applying a new chemical stain in its place is difficult and not often practiced in clinical settings. This constraint stymied previous virtual stain transfer methods and made it problematic to acquire paired images of different stain types.
A virtual tissue re-staining method preserves the biopsied tissue so that more advanced diagnostic tests can be performed, eliminating the need for a second, potentially unnecessary biopsy.
This image depicts the virtual re-staining of tissue using cascaded deep neural networks. Courtesy of the Ozcan Lab at UCLA.
The Ozcan team validated the performance of its cascaded deep neural network approach using kidney needle core biopsy tissue sections. The researchers demonstrated the successful transfer of H&E-stained tissue images into a virtual PAS stain.
By giving clinicians the option to virtually re-stain tissue, the method for virtual stain transfer could lead to new opportunities in digital pathology and tissue-based diagnostics. Additionally, the virtual re-staining method could be applied to other types of stains used in histology.
The research was published in ACS Photonics (www.doi.org/10.1021/acsphotonics.2c00932).