Entropy Analysis of Univariate Biomedical Signals: Review and Comparison of Methods

Faes L.;
2022-10-01

Abstract

Nonlinear techniques have attracted increasing interest for the dynamical analysis of various kinds of systems. Among these techniques, entropy-based metrics have emerged as practical alternatives to classical methods thanks to their wide applicability in different scenarios, especially for short and noisy processes. Rooted in information theory, entropy approaches are of great interest for evaluating the degree of irregularity and complexity of physical, physiological, social, and econometric systems. Building on Shannon entropy and conditional entropy (CE), various techniques have been proposed; among them, approximate entropy, sample entropy, fuzzy entropy, distribution entropy, permutation entropy, and dispersion entropy are probably the best known. After a presentation of the basic information-theoretic functionals, these measures are detailed, together with recent proposals inspired by nearest-neighbor and parametric approaches. Moreover, the role of embedding dimension, data length, and parameter choices in using these measures is described. Their computational efficiency is also commented on. Finally, the limitations and advantages of the above-mentioned entropy measures for practical use are discussed. The Matlab codes used in this chapter are available at https://github.com/HamedAzami/Univariate Entropy Methods.
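To give a flavor of the measures the chapter reviews, the following is a minimal, illustrative sketch of sample entropy in Python (NumPy), not the chapter's accompanying Matlab toolbox. SampEn(m, r) counts pairs of length-m templates lying within Chebyshev distance r of each other (self-matches excluded), repeats the count for length m+1, and returns the negative log of the ratio; the parameter names `m` (embedding dimension) and `r` (tolerance as a fraction of the signal's standard deviation) follow common convention.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D signal.

    m : embedding dimension (template length)
    r : tolerance, expressed as a fraction of the signal's std
    """
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(dim):
        # All overlapping templates of length `dim`.
        templates = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to every later template
            # (later templates only, so each pair is counted once and
            # self-matches are excluded).
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist < tol)
        return count

    B = count_matches(m)      # matching template pairs of length m
    A = count_matches(m + 1)  # matching template pairs of length m + 1
    return -np.log(A / B)

# A regular sinusoid yields lower entropy than white Gaussian noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 10 * np.pi, 500)
print(sample_entropy(np.sin(t)))                 # low: regular signal
print(sample_entropy(rng.standard_normal(500)))  # higher: irregular signal
```

This quadratic-time loop is fine for short records, which is exactly the regime where entropy metrics are advocated; for long signals, optimized implementations such as those in the linked repository are preferable.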
Oct 2022
Azami H., Faes L., Escudero J., Humeau-Heurtier A., Silva L.E.V. (2022). Entropy Analysis of Univariate Biomedical Signals: Review and Comparison of Methods. In Willi Freeden and M. Zuhair Nashed (Eds.), Frontiers In Entropy Across The Disciplines - Panorama Of Entropy: Theory, Computation, And Applications (pp. 233-286). World Scientific Publishing Co. Pte. Ltd.
Files in this product:
File: B08-EntropyAnalysis_proofs-ch09.pdf
Type: Publisher's version
Format: Adobe PDF
Size: 1.2 MB
Access: restricted (request a copy)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10447/586177
Citations
  • Scopus: 17