
Enhancing Recognition and Categorization of Skin Lesions with Tailored Deep Convolutional Networks and Robust Data Augmentation Techniques

Hussain, Syed Ibrar
;
Toscano, Elena
2025-04-30

Abstract

This study introduces deep convolutional neural network-based methods for the detection and classification of skin lesions, enhancing system accuracy through a combination of architectures, pre-processing techniques and data augmentation. Multiple networks, including XceptionNet, DenseNet, MobileNet, NASNet Mobile, and EfficientNet, were evaluated to test deep learning’s potential in complex, multi-class classification tasks. Training these models on pre-processed datasets with optimized hyper-parameters (e.g., batch size, learning rate, and dropout) improved classification precision for early-stage skin cancers. Evaluation measures such as accuracy and loss confirmed high classification efficiency with minimal overfitting, as the validation results aligned closely with training. DenseNet-201 and MobileNet-V3 Large demonstrated strong generalization abilities, whereas EfficientNetV2-B3 and NASNet Mobile achieved the best balance between accuracy and efficiency. The application of different augmentation rates per class also enhanced the handling of imbalanced data, resulting in more accurate large-scale detection. Comprehensive pre-processing ensured balanced class representation, and EfficientNetV2 models achieved exceptional classification accuracy, attributed to their optimized architecture balancing depth, width, and resolution. These models showed high convergence rates and generalization, supporting their suitability for medical imaging tasks using transfer learning.
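The per-class augmentation rates described in the abstract can be sketched as a simple oversampling-factor computation: rarer lesion classes receive proportionally more augmented copies so that all classes approach the size of the majority class. This is a minimal illustration of the idea, not the authors' implementation; the function name and example class counts are hypothetical.

```python
from collections import Counter

def augmentation_rates(labels, target=None):
    """Compute a per-class augmentation multiplier so that, after
    augmentation, every class reaches roughly the size of the largest
    (or a given target) class. Hypothetical helper illustrating the
    per-class augmentation strategy described in the abstract."""
    counts = Counter(labels)
    target = target or max(counts.values())
    # Ceiling division: augmented copies needed per original image.
    return {cls: -(-target // n) for cls, n in counts.items()}

# Example: an imbalanced lesion dataset (class label per image).
labels = ["nevus"] * 800 + ["melanoma"] * 200 + ["bcc"] * 100
rates = augmentation_rates(labels)
print(rates)  # {'nevus': 1, 'melanoma': 4, 'bcc': 8}
```

Each image of a minority class would then be passed through the augmentation pipeline (rotations, flips, etc.) the indicated number of times, yielding the balanced class representation the study attributes its improved large-scale detection to.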
Field: MATH-05/A - Numerical Analysis
Hussain, S.I., Toscano, E. (2025). Enhancing Recognition and Categorization of Skin Lesions with Tailored Deep Convolutional Networks and Robust Data Augmentation Techniques. MATHEMATICS, 13(9) [10.3390/math13091480].
Files for this item:
File: mathematics-13-01480 (3)_c.pdf
Access: open access
Type: Published version (Versione Editoriale)
Size: 520.2 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise stated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10447/690465
Citations
  • PMC: n/a
  • Scopus: 3
  • Web of Science: 3