
Multimodal artificial intelligence (AI) has proven to be an effective tool for endoscopists of all expertise levels in the interpretation of imaging and the diagnosis of solid pancreatic lesions, according to new research.
A randomized crossover trial led by Haochen Cui, MD, and colleagues assessed whether multimodal AI assistance enables endoscopists of different experience levels to diagnose solid pancreatic lesions more effectively than interpretation of endoscopic ultrasonographic (EUS) imaging alone.
Twelve endoscopists of varying expertise (expert, senior, and novice) from nine centers were assigned to diagnose solid pancreatic lesions with or without AI assistance, using a prospective dataset of 130 patients who underwent EUS procedures at four institutions across China between January 1 and June 30, 2023.
The researchers developed three AI models to integrate the information from both EUS imaging and clinical data:
- Model 1: A convolutional neural network model was trained on EUS images using Microsoft Visio with transfer learning.
- Model 2: Machine learning models were trained on 36 clinical features from five categories.
- Model 3: A multilayer perceptron model with two fully connected layers used three data fusion strategies to combine features, probabilities, or predictions from models 1 and 2.
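The fusion approach described for model 3 follows a common multimodal pattern in which representations from an image model and a clinical-data model are combined before a small classification head. Below is a minimal, hypothetical PyTorch sketch of the feature-level fusion strategy; the layer sizes, feature dimensions, and names are illustrative assumptions, not details taken from the study.

```python
# Hypothetical sketch of feature-level fusion for a model-3-style classifier.
# Dimensions and names are illustrative assumptions, not from the published study.
import torch
import torch.nn as nn

class FusionMLP(nn.Module):
    """Two fully connected layers combining image and clinical feature vectors."""
    def __init__(self, img_feat_dim=512, clin_feat_dim=36, hidden_dim=64):
        super().__init__()
        self.fc1 = nn.Linear(img_feat_dim + clin_feat_dim, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim, 1)  # binary diagnostic output (illustrative)

    def forward(self, img_features, clin_features):
        # Feature-level fusion: concatenate the two modalities' representations
        x = torch.cat([img_features, clin_features], dim=1)
        x = torch.relu(self.fc1(x))
        return torch.sigmoid(self.fc2(x))

# Example usage with random stand-in features for a batch of 4 cases
model = FusionMLP()
img_feats = torch.randn(4, 512)  # e.g., CNN embeddings of EUS images (model 1 stand-in)
clin_feats = torch.randn(4, 36)  # e.g., encoded clinical features (model 2 stand-in)
probs = model(img_feats, clin_feats)
print(probs.shape)  # torch.Size([4, 1])
```

Probability- or prediction-level fusion, the other two strategies mentioned, would instead feed the two models' output probabilities or class predictions (rather than their internal features) into the same kind of two-layer head.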
When only EUS images were provided, model 1 outperformed the endoscopists, with a sensitivity of 0.93 (95% CI, 0.85-0.97) versus 0.74 (95% CI, 0.59-0.85; P=.02) for expert endoscopists, 0.62 (95% CI, 0.50-0.73; P<.001) for senior endoscopists, and 0.56 (95% CI, 0.46-0.66; P<.001) for novice endoscopists.
When diagnoses were based on both clinical information and EUS images, model 3 remained superior in diagnostic performance. Compared with senior endoscopists, model 3 was more sensitive (0.92; 95% CI, 0.84-0.96 vs 0.72; 95% CI, 0.61-0.82; P=.002) and more accurate (0.92; 95% CI, 0.86-0.96 vs 0.77; 95% CI, 0.68-0.84; P=.001). It was also more sensitive (vs 0.61; 95% CI, 0.51-0.70; P<.001) and more accurate (vs 0.69; 95% CI, 0.61-0.76; P<.001) than novice endoscopists.
With AI assistance, novice endoscopists demonstrated significant improvements in sensitivity (0.91; 95% CI, 0.83-0.95; P<.001) and accuracy (0.90; 95% CI, 0.83-0.94; P<.001).
“In the future, this joint-AI model, with its enhanced transparency in the decision-making process, has the potential to facilitate the diagnosis of solid lesions in the pancreas,” the authors concluded.