In lifesaving situations, fast and accurate diagnostic tools are critical for pathologists examining biopsied tissue samples for signs of disease. UCLA engineers have found a new path to that goal: virtual re-staining of tissue images that is faster than human-performed special stains and just as accurate.
Under a microscope, pathologists inspect tissue samples from biopsies that have been stained with special dyes to enhance contrast and color. The most common is the hematoxylin and eosin (H&E) stain. In many clinical cases, however, additional special stains are needed to add contrast and color to different tissue components, giving pathologists a clearer diagnostic picture. These special stains often require significantly more time for tissue preparation, along with added effort and monitoring by expert histotechnologists, all of which increases the cost and time of disease diagnosis.
To speed up this process dramatically, UCLA researchers have developed a computational technique, powered by artificial intelligence, that transforms images of tissue previously stained with H&E into new ones with added special stains. The process takes less than one minute per tissue sample, as opposed to several hours or even more than a day when performed by human experts. This speed enables faster preliminary diagnoses that require special stains, while also providing significant cost savings. Nature Communications recently published the research study outlining the new technique and its impact.
“We developed a deep learning-based technique that eliminates the need for special stains to be performed by histotechnologists,” said research lead Aydogan Ozcan, UCLA’s Volgenau Professor for Engineering Innovation at the Electrical and Computer Engineering Department of the Samueli School of Engineering and an associate director of the California NanoSystems Institute (CNSI). “The enhanced speed and accuracy are particularly important when diagnosing medical conditions such as organ transplant rejection cases, where a fast and accurate diagnosis enables rapid treatment, which may lead to greatly improved clinical outcomes.”
Ozcan’s team demonstrated the AI-based technique by generating a full panel of special stains used for kidney tissue: Periodic Acid-Schiff (PAS), Jones silver stain and Masson’s Trichrome. Using specialized deep neural networks trained on existing images of H&E-stained tissue biopsies, the researchers virtually generated these special stains on a variety of clinical samples covering a broad range of kidney diseases. A multi-institution team of board-certified renal pathologists then performed a clinical evaluation of the virtual, stain-to-stain transformation technique. They found a statistically significant improvement in the diagnoses achieved by using the virtually generated special stains compared with using the H&E-stained biopsy images alone. An additional study showed that the quality of the virtually re-stained images is statistically equivalent to that of images processed with special stains by human experts.
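The core idea is a learned mapping from the colors and textures of one stain to those of another. The published method uses deep convolutional neural networks for this image-to-image transformation; as a loose, greatly simplified illustration of the concept only (not the authors' actual model), the sketch below fits an affine per-pixel color mapping between a synthetic "H&E-like" image and a synthetic "special-stain-like" target by least squares. All data, matrices and names here are invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "training" data: RGB pixels from an H&E-stained image (inputs) and
# the corresponding pixels from a specially stained image (targets).
# Both are synthetic; the ground-truth mapping is an arbitrary linear
# color transform plus offset, chosen purely for illustration.
n = 1000
he_pixels = rng.uniform(0.0, 1.0, size=(n, 3))
true_matrix = np.array([[0.6, 0.2, 0.0],
                        [0.1, 0.7, 0.1],
                        [0.0, 0.2, 0.8]])
true_offset = np.array([0.05, 0.02, 0.1])
special_pixels = he_pixels @ true_matrix.T + true_offset

# Fit an affine color mapping by least squares: augmenting the inputs
# with a constant column lets one solve recover both matrix and offset.
X = np.hstack([he_pixels, np.ones((n, 1))])
coeffs, *_ = np.linalg.lstsq(X, special_pixels, rcond=None)

# Virtually "re-stain" a new H&E pixel with the fitted mapping.
new_pixel = np.array([0.5, 0.3, 0.8])
restained = np.hstack([new_pixel, 1.0]) @ coeffs
expected = new_pixel @ true_matrix.T + true_offset
print(np.allclose(restained, expected, atol=1e-6))
```

A real virtual-staining network must capture spatial context and tissue morphology, not just per-pixel color, which is why the study trains deep neural networks on whole image fields rather than fitting a global color transform like this one.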
Furthermore, since the technique is applied to existing H&E-stained images, the researchers emphasized that it would be easy to adopt, as it does not require any changes to the current tissue processing workflow used in pathology labs.
In addition to Ozcan, who also holds a faculty appointment in the Bioengineering Department, the research team consists of W. Dean Wallace, a professor of pathology at the USC Keck School of Medicine; and Yair Rivenson, an adjunct professor of electrical and computer engineering at UCLA; as well as UCLA Samueli graduate students Kevin de Haan, Yijie Zhang and Tairan Liu. Clinical validation of this virtual re-staining method was advised by Dr. Jonathan Zuckerman of the UCLA Department of Pathology and Laboratory Medicine at the David Geffen School of Medicine.
The research was supported by the National Science Foundation’s Biophotonics Program.