DOI: 10.1055/s-0045-1805184
The association between heatmap position and the diagnostic accuracy of artificial intelligence algorithms for the characterization of colorectal polyps
Aims Artificial intelligence (AI) algorithms for diagnosing colorectal polyps are emerging but are not yet widely used in clinical practice, likely due to a lack of trust in AI. Explainable AI is a means of increasing trust in AI by enhancing its transparency, potentially improving the interaction between AI and endoscopists [1]. Heatmaps are an example of visually explainable AI [2] [3]. Knowledge of how to interpret heatmaps correctly is required to clarify whether repositioning the endoscope, when the heatmap does not cover the polyp correctly, could increase the chance of an accurate characterization. This study aims to investigate the association between heatmap position and AI accuracy for the endoscopic characterization of colorectal polyps.
Methods A test dataset of colorectal polyp images was collected prospectively in two Dutch hospitals between September 2022 and January 2024. Four AI algorithms were trained on a separate training dataset. All four algorithms were trained to characterize the images in the test dataset as benign (hyperplastic polyps) or premalignant (sessile serrated lesions and adenomas), with histopathology as the gold standard. The algorithms provided heatmaps obtained with Grad-CAM [3]. Heatmap position was compared with human-annotated polyp position. The percentage of heatmap covering the polyp was calculated as the ratio between the overlap area of heatmap and polyp and the entire heatmap area, multiplied by 100. The percentage of polyp not covered by heatmap was calculated as the ratio between the polyp area not covered by the heatmap and the entire polyp area, multiplied by 100. Generalized estimating equations were used to assess the association between heatmap position and a correct AI diagnosis.
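For illustration only (this is not the study's implementation), the two overlap percentages could be computed from binary masks as in the following sketch, assuming the thresholded Grad-CAM heatmap and the annotated polyp region are available as boolean arrays of the same shape; the function name and the thresholding step are assumptions made for this example.

```python
import numpy as np

def coverage_percentages(heatmap_mask: np.ndarray, polyp_mask: np.ndarray):
    """Illustrative computation of the two overlap metrics described in the Methods.

    heatmap_mask, polyp_mask: boolean arrays of identical shape, True where the
    (thresholded) heatmap or the human-annotated polyp is present.
    """
    overlap = np.logical_and(heatmap_mask, polyp_mask).sum()
    heatmap_area = heatmap_mask.sum()
    polyp_area = polyp_mask.sum()

    # Percentage of heatmap covering the polyp: overlap / entire heatmap area * 100
    pct_heatmap_on_polyp = 100.0 * overlap / heatmap_area if heatmap_area else 0.0
    # Percentage of polyp not covered by heatmap: uncovered polyp / entire polyp area * 100
    pct_polyp_uncovered = 100.0 * (polyp_area - overlap) / polyp_area if polyp_area else 0.0
    return pct_heatmap_on_polyp, pct_polyp_uncovered
```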
Results In total, 2133 images were collected of 376 colorectal polyps. The majority of colorectal polyps were diminutive (90.6%). Higher percentages of heatmap covering the colorectal polyp were associated with correct diagnoses in all four algorithms (OR 1.013 [95% CI 1.006-1.019], OR 1.025 [95% CI 1.011-1.039], OR 1.038 [95% CI 1.024-1.053], OR 1.039 [95% CI 1.020-1.058]; all p<0.001). A higher percentage of polyp not covered by heatmap was associated with a correct diagnosis for Algorithm 1 (OR 1.006 [95% CI 1.003-1.010], p<0.001), whereas for Algorithm 2 a lower percentage was associated with a correct diagnosis (OR 0.992 [95% CI 0.985-1.000], p=0.044). Algorithms 3 and 4 showed negative but not statistically significant associations.
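As a hedged illustration of the analysis approach (not the authors' code), odds ratios of this kind, per one-percentage-point increase in heatmap coverage with images clustered within polyps, could be obtained from a logistic generalized estimating equation as sketched below; the data frame, column names, and synthetic data are assumptions for the example.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic example data: several images per polyp (the clustering unit).
n_images = 400
polyp_id = rng.integers(0, 80, size=n_images)            # ~80 hypothetical polyps
pct_cover = rng.uniform(0, 100, size=n_images)           # % of heatmap covering polyp
# Simulate a weak positive association between coverage and a correct diagnosis.
p_correct = 1 / (1 + np.exp(-(-0.5 + 0.02 * pct_cover)))
correct = rng.binomial(1, p_correct)

df = pd.DataFrame({"correct": correct,
                   "pct_heatmap_on_polyp": pct_cover,
                   "polyp_id": polyp_id})

# Logistic GEE with an exchangeable working correlation, clustered per polyp.
model = smf.gee("correct ~ pct_heatmap_on_polyp",
                groups="polyp_id",
                data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()

# Odds ratio per 1-percentage-point increase in coverage, with 95% CI.
print(np.exp(result.params["pct_heatmap_on_polyp"]))
print(np.exp(result.conf_int().loc["pct_heatmap_on_polyp"]))
```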
Conclusions Higher percentages of heatmap covering the polyp were associated with correct diagnoses across all four AI algorithms. This indicates that it is clinically relevant to strive for AI predictions with heatmaps covering as much colorectal polyp tissue and as little surrounding colon tissue as possible. As long as the heatmap mainly covers colorectal polyp tissue, it appears less important whether an additional part of the polyp remains uncovered by the heatmap. These results contribute to the optimal use of AI algorithms for the characterization of colorectal polyps. Knowing how to interpret heatmaps could increase trust in AI and, with that, benefit the implementation of AI in clinical practice.
Publication History
Article published online:
27 March 2025
© 2025. European Society of Gastrointestinal Endoscopy. All rights reserved.
Georg Thieme Verlag KG
Oswald-Hesse-Straße 50, 70469 Stuttgart, Germany
References
- 1 Selvaraju RR, Cogswell M, Das A, Vedantam R, Parikh D, Batra D. Grad-CAM: Visual explanations from deep networks via gradient-based localization. International Journal of Computer Vision 2020; 128: 336-359
- 2 Mori Y, Jin EH, Lee D. Enhancing artificial intelligence-doctor collaboration for computer-aided diagnosis in colonoscopy through improved digital literacy. Dig Liver Dis 2024; 56: 1140-1143
- 3 Jin EH, Lee D, Bae JH. et al. Improved Accuracy in Optical Diagnosis of Colorectal Polyps Using Convolutional Neural Networks with Visual Explanations. Gastroenterology 2020; 158: 2169-2179.e8