
Kocak B., Akinci D'Antonoli T., Mercaldo N., Alberich-Bayarri A., Baessler B., Ambrosini I., et al. (2024). METhodological RadiomICs Score (METRICS): a quality scoring tool for radiomics research endorsed by EuSoMII. INSIGHTS INTO IMAGING, 15(1), 1-18 [10.1186/s13244-023-01572-w].

METhodological RadiomICs Score (METRICS): a quality scoring tool for radiomics research endorsed by EuSoMII

Cannella R.; Vernuccio F.
2024-01-17

Abstract

Purpose
To propose a new quality scoring tool, METhodological RadiomICs Score (METRICS), to assess and improve the research quality of radiomics studies.

Methods
We conducted an online modified Delphi study with a group of international experts. It was performed in three consecutive stages: Stage 1, item preparation; Stage 2, panel discussion among EuSoMII Auditing Group members to identify the items to be voted on; and Stage 3, four rounds of the modified Delphi exercise by panelists to determine the items eligible for METRICS and their weights. The consensus threshold was 75%. Category and item weights were calculated from the median ranks derived from expert panel opinion and their rank-sum-based conversion to importance scores.

Results
In total, 59 panelists from 19 countries participated in the selection and ranking of the items and categories. The final METRICS tool included 30 items within 9 categories. According to their weights, the categories were, in descending order of importance: study design, imaging data, image processing and feature extraction, metrics and comparison, testing, feature processing, preparation for modeling, segmentation, and open science. A web application and a repository were developed to streamline the calculation of the METRICS score and to collect feedback from the radiomics community.

Conclusion
In this work, we developed a scoring tool for assessing the methodological quality of radiomics research, with a large international panel and a modified Delphi protocol. With its conditional format covering methodological variations, it provides a well-constructed framework of the key methodological concepts for assessing the quality of radiomics research papers.

Critical relevance statement
A quality assessment tool, METhodological RadiomICs Score (METRICS), is made available by a large group of international domain experts, with transparent methodology, aiming at evaluating and improving research quality in radiomics and machine learning.

Key points
• A methodological scoring tool, METRICS, was developed for assessing the quality of radiomics research, with a large international expert panel and a modified Delphi protocol.
• The proposed scoring tool presents expert opinion-based importance weights of categories and items with a transparent methodology for the first time.
• METRICS accounts for varying use cases, from handcrafted radiomics to entirely deep learning-based pipelines.
• A web application has been developed to help with the calculation of the METRICS score (https://metricsscore.github.io/metrics/METRICS.html) and a repository created to collect feedback from the radiomics community (https://github.com/metricsscore/metrics).
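The weighting step described in the Methods — converting median expert ranks into normalized importance weights via a rank-sum scheme — can be illustrated with a minimal sketch. This uses the classic rank-sum formula w_i = (n − r_i + 1) / Σ_j (n − r_j + 1); the category names and ranks below are illustrative placeholders, and the exact conversion used by METRICS may differ from this textbook variant.

```python
def rank_sum_weights(median_ranks):
    """Convert ranks (1 = most important) into normalized importance weights
    using the classic rank-sum scheme: w_i = (n - r_i + 1) / sum_j (n - r_j + 1).

    NOTE: illustrative sketch only; not necessarily the exact formula
    used to derive the published METRICS weights.
    """
    n = len(median_ranks)
    raw = {name: n - rank + 1 for name, rank in median_ranks.items()}
    total = sum(raw.values())
    return {name: value / total for name, value in raw.items()}


# Hypothetical median ranks for three categories (placeholder values).
ranks = {"study design": 1, "imaging data": 2, "segmentation": 3}
weights = rank_sum_weights(ranks)
print(weights)  # weights sum to 1; better-ranked categories get larger weights
```

With this scheme, a category ranked first by the panel receives the largest share of the total weight, and the weights always sum to 1, so item scores can be combined into a single bounded quality score.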
17 January 2024
Files in this record:

File: 13244_2023_Article_1572.pdf
Access: open access
Type: Published version (Versione Editoriale)
Size: 5.01 MB
Format: Adobe PDF
View/Open

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10447/639582
Citations
  • PubMed Central: not available
  • Scopus: 19
  • Web of Science: 11