Rad4XCNN: A new agnostic method for post-hoc global explanation of CNN-derived features by means of Radiomics

Prinzi, Francesco; Zarcaro, Calogero; Bartolotta, Tommaso Vincenzo; Vitabile, Salvatore
2025-03-01

Abstract

Background and Objective: In recent years, machine learning-based clinical decision support systems (CDSS) have played a key role in the analysis of several medical conditions. Despite their promising capabilities, the lack of transparency in AI models poses significant challenges, particularly in medical contexts where reliability is mandatory. Moreover, explainability often appears to be inversely related to accuracy, so achieving transparency without compromising predictive performance remains a key challenge. Methods: This paper presents a novel method, named Rad4XCNN, that combines the predictive power of CNN-derived features with the inherent interpretability of radiomic features. Unlike conventional saliency-map-based approaches, Rad4XCNN attaches intelligible meaning to CNN-derived features by means of radiomics, offering a perspective on explanation that goes beyond visualization maps. Results: Using breast cancer classification as a case study, we evaluated Rad4XCNN on ultrasound imaging datasets, including an online dataset and two in-house datasets for internal and external validation. Key results are: (i) CNN-derived features yield more robust accuracy than ViT-derived and radiomic features; (ii) conventional visualization-map-based explanation methods present several pitfalls; (iii) Rad4XCNN does not sacrifice model accuracy for explainability; (iv) Rad4XCNN provides global explanations that enable physicians to extract dataset-level insights and findings. Conclusions: Our method can mitigate some concerns related to the explainability-accuracy trade-off. This study highlights the importance of developing explanation methods that do not compromise model accuracy.
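To make the core idea concrete, the sketch below shows one plausible way to attach radiomic meaning to CNN-derived features: given per-image deep features (e.g., from a CNN's penultimate layer) and radiomic features (e.g., computed with PyRadiomics), each deep feature is associated with the radiomic features it correlates with most strongly across the dataset, and those associations act as its global, human-readable description. This is a minimal illustration under assumptions made here (precomputed NumPy feature matrices, Spearman correlation as the association measure, the hypothetical helper associate_features); it is not the authors' published implementation.

# Illustrative sketch only: array layouts, names, and the use of Spearman
# correlation are assumptions made for clarity, not the paper's exact pipeline.
import numpy as np
from scipy.stats import spearmanr

def associate_features(cnn_feats, radiomic_feats, radiomic_names, top_k=3):
    """For each CNN-derived feature, return the top-k radiomic features
    ranked by absolute Spearman correlation across the dataset.

    cnn_feats      : (n_samples, n_cnn) array of deep features
    radiomic_feats : (n_samples, n_rad) array of radiomic features
    radiomic_names : list of n_rad human-readable radiomic feature names
    """
    associations = []
    for j in range(cnn_feats.shape[1]):
        # Correlation of CNN feature j with every radiomic feature
        rhos = np.array([spearmanr(cnn_feats[:, j], radiomic_feats[:, r])[0]
                         for r in range(radiomic_feats.shape[1])])
        top = np.argsort(-np.abs(rhos))[:top_k]
        associations.append([(radiomic_names[r], float(rhos[r])) for r in top])
    return associations  # associations[j] is a readable proxy for CNN feature j

For example, a CNN-derived feature that correlates strongly with first-order entropy and GLCM contrast could be reported to the radiologist in those terms, giving a dataset-level (global) explanation rather than a per-image saliency map.
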
Prinzi, F., Militello, C., Zarcaro, C., Bartolotta, T.V., Gaglio, S., Vitabile, S. (2025). Rad4XCNN: A new agnostic method for post-hoc global explanation of CNN-derived features by means of Radiomics. Computer Methods and Programs in Biomedicine, 260. https://doi.org/10.1016/j.cmpb.2024.108576
Files in this record:
1-s2.0-S0169260724005698-main.pdf (open access; publisher's version; Adobe PDF; 2.52 MB)


Use this identifier to cite or link to this document: https://hdl.handle.net/10447/673543
Citations
  • PMC: n/a
  • Scopus: 6
  • Web of Science: 5