Google researchers have recently worked with Northwestern Medicine to create an artificial intelligence (AI) system that detects lung cancer more accurately than human radiologists. The system was trained with a deep-learning algorithm and interprets computed tomography (CT) scans to predict a patient's likelihood of having the disease. Daniel Tse, product manager at Google Brain, is the corresponding author of the study, which appeared on May 20 in the journal Nature Medicine.
Tse and the research team applied deep learning to 42,290 low-dose CT (LDCT) scans provided by the Northwestern Electronic Data Warehouse and other sources belonging to Northwestern hospitals in Chicago. The images came from almost 15,000 patients in a National Institutes of Health study conducted in 2002, 578 of whom developed cancer within a year.
These scans can reveal the unregulated proliferation occurring in cancerous tissue, making them a powerful tool for detecting lung cancer. Some argue that LDCT scans are superior to X-rays for this purpose, with research showing that LDCT screening may reduce lung cancer mortality by 20 percent. Because these scans can be very difficult to read, AI-powered tools offer a means of improving their interpretation.
In the Google-funded study, the researchers used the AI as a diagnostic tool that evaluated these images and predicted the likelihood of malignancy without human input. The team compared the deep-learning model's predictions to those of six board-certified radiologists with up to 20 years of clinical experience.
When analyzing a single CT scan, the AI model detected lung cancers 5 percent more often than the experts and produced 11 percent fewer false positives. The physicians and the AI showed similar diagnostic performance when prior scans were available for comparison.
“By showing that deep learning can increase specificity without sacrificing sensitivity, we hope to spur more research and conversation around the role AI can play in tipping the cost-benefit scale for cancer screening,” said Tse and Google technical lead Shravya Shetty in a blog post.
This AI system will be made available via the Google Cloud Healthcare API as the company continues to conduct trials and additional research.
“The AI system uses 3D volumetric deep learning to analyze the full anatomy on chest CT scans, as well as patches based on object detection techniques that identify regions with malignant lesions,” the blog post reads.
Co-author Dr. Mozziyar Etemadi, research assistant professor of anesthesiology at Northwestern University Feinberg School of Medicine in Chicago, offers some insight into how the AI exceeds human evaluation in accuracy.
“Radiologists generally examine hundreds of 2D images or ‘slices’ in a single CT scan, but this new machine learning system views the lungs in a huge, single 3D image,” Etemadi says. “AI in 3D can be much more sensitive in its ability to detect early lung cancer than the human eye looking at 2D images. This is technically ‘4D’ because it is not only looking at one CT scan but two (the current and prior scan) over time.”
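The study's actual architecture is not reproduced in the article, but the core idea Etemadi describes, filtering a full 3D volume rather than independent 2D slices, can be sketched with a toy example (all shapes, values, and the averaging kernel below are illustrative assumptions, not details from the paper):

```python
import numpy as np

def conv3d(volume, kernel):
    """Valid-mode 3D filtering (cross-correlation) of a CT volume with a
    single kernel. volume: (depth, height, width); kernel: (kd, kh, kw)."""
    kd, kh, kw = kernel.shape
    D, H, W = volume.shape
    out = np.zeros((D - kd + 1, H - kh + 1, W - kw + 1))
    for z in range(out.shape[0]):
        for y in range(out.shape[1]):
            for x in range(out.shape[2]):
                # Each output voxel summarizes a 3D neighborhood, so
                # structure that continues across adjacent slices (as a
                # lesion does) contributes to the response -- context a
                # slice-by-slice 2D filter never sees.
                out[z, y, x] = np.sum(volume[z:z+kd, y:y+kh, x:x+kw] * kernel)
    return out

# Toy "CT volume": 8 slices of 16x16 voxels of random intensities.
volume = np.random.rand(8, 16, 16)

# A 3x3x3 averaging kernel spans three adjacent slices at once.
kernel = np.ones((3, 3, 3)) / 27.0
features = conv3d(volume, kernel)
print(features.shape)  # (6, 14, 14)
```

A learned 3D convolutional network stacks many such filters with trainable weights; the "4D" aspect Etemadi mentions would come from additionally feeding the model a prior scan of the same patient.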
Google has made other headlines in the digital healthcare field recently, with its use of AI to screen diabetic patients for eye disease and to create apps that aid those with hearing impairments.
Today in @NatureMedicine, we’re publishing our progress towards modeling lung cancer prediction as well as laying the groundwork for future clinical testing. Check out the paper below, and learn more at https://t.co/hWE7mOcBL2 #MachineLearning https://t.co/n5t0jp8JiB
— Google AI (@GoogleAI) May 20, 2019