Researchers from the University of Texas at Dallas have recently used a unique imaging technology combined with artificial intelligence (AI) to predict the presence of cancer cells in tissue samples. Hyperspectral imaging has proven effective in satellite imagery and orbiting telescopes, and these findings suggest it could be used to quickly identify cancer cells in the operating room as well. The research was published on September 14 in the journal Cancers and was conducted by Dr. Baowei Fei, professor of bioengineering and the Cecil H. and Ida Green Chair in Systems Biology Science in the Erik Jonsson School of Engineering and Computer Science.
Analyzing 293 tissue samples from 102 head and neck cancer surgery patients, Fei and colleagues found that hyperspectral imaging and AI could predict the presence of cancer cells with 80-90% accuracy. Fei recently received a $1.6 million grant from the Cancer Prevention & Research Institute of Texas (CPRIT) to continue improving this smart surgical microscope. Once fully developed, the approach would need to be validated in clinical studies before being used in routine patient care.
“We hope that this technology can help surgeons better detect cancer during surgery, reduce operating time, lower medical costs and save lives,” explained Fei. “Hyperspectral imaging is noninvasive, portable and does not require radiation or a contrast agent.”
Intraoperative frozen section analysis is the technique currently used by pathologists to analyze tissue samples from patients undergoing cancer surgery. This is done while the patient is still under anesthesia and can require several resections before the surgeon reaches nonmalignant tissue. If cancer cells are not sampled or detected during surgery, additional operations can be required as well.
The hyperspectral imaging technique used in this study examines cells under ultraviolet and near-infrared light at micrometer resolution. Cells reflect and absorb light in a manner that produces a unique spectral signature. Fei noted that the grant will go toward training the microscope’s AI to recognize cancer using images of malignant and healthy cells.
“If we have a large database that knows what is normal tissue and what is cancerous tissue, then we can train our system to learn the features of the spectra,” Fei said. “Once it’s trained, the smart device can predict whether a new sample is a cancerous tissue or not. That’s how machine learning can help with a cancer diagnosis.”
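The workflow Fei describes — learning the spectral features of labeled normal and cancerous tissue, then predicting labels for new samples — can be illustrated with a minimal sketch. Everything below is hypothetical: the spectra are synthetic (a smooth reflectance baseline, with cancerous samples given an extra absorption dip), and the random-forest classifier is an illustrative stand-in, not the model used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical per-sample spectra: 91 wavelength bands per spectrum.
n_bands = 91
n_per_class = 500

# "Healthy" tissue: a smooth baseline reflectance curve plus noise.
baseline = np.linspace(0.4, 0.7, n_bands)
healthy = baseline + rng.normal(0.0, 0.05, (n_per_class, n_bands))

# "Cancerous" tissue: same baseline with an extra absorption dip
# centered on a mid-range band (purely synthetic, for illustration).
dip = 0.1 * np.exp(-((np.arange(n_bands) - 45) ** 2) / 50.0)
cancer = baseline - dip + rng.normal(0.0, 0.05, (n_per_class, n_bands))

# Labeled "database": 0 = normal, 1 = cancerous.
X = np.vstack([healthy, cancer])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Train on part of the data, evaluate on held-out samples.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

acc = accuracy_score(y_test, clf.predict(X_test))
print(f"held-out accuracy: {acc:.2f}")
```

With clean synthetic classes like these, the classifier separates the two spectra easily; real surgical specimens are far noisier, which is why the reported accuracy in the study sits in the 80-90% range rather than near 100%.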
The smart surgical microscope is expected to yield results instantaneously, significantly cutting surgery time and cost. Each frozen section evaluation can take 30-45 minutes, making the surgery a time-consuming process even when repeated resection is not required. By decreasing the amount of time spent in the operating room, this AI-enabled device could reduce how long a patient is under anesthesia and the associated risks.