Faes, L., Porta, A. (2014). Conditional entropy-based evaluation of information dynamics in physiological systems. In Understanding Complex Systems (pp. 61-86). Springer Verlag [10.1007/978-3-642-54474-3-3].
Conditional entropy-based evaluation of information dynamics in physiological systems
Faes, Luca; Porta, Alberto
2014-01-01
Abstract
We present a framework for quantifying the dynamics of information in coupled physiological systems based on the notion of conditional entropy (CondEn). First, we revisit some basic concepts of information dynamics, providing definitions of self entropy (SE), cross entropy (CE) and transfer entropy (TE) as measures of information storage and transfer in bivariate systems. We also discuss the generalization to multivariate systems, showing the importance of SE, CE and TE as relevant terms in the decomposition of the system's predictive information. Then, we show how all these measures can be expressed in terms of CondEn, and accordingly devise a framework for their data-efficient estimation. The framework builds on a CondEn estimator that follows a sequential conditioning procedure, whereby the conditioning vectors are formed progressively according to a criterion of CondEn minimization, and compensates for the bias that arises as the dimension of the conditioning vectors increases. The framework is illustrated on numerical examples showing its ability to cope with the curse of dimensionality in the multivariate computation of CondEn, and to reliably estimate SE, CE and TE under the challenging conditions of biomedical time series analysis, characterized by noise and small sample size. Finally, we illustrate the practical application of the presented framework to cardiovascular and neural time series, reporting examples in which SE, CE and TE are estimated to quantify the information dynamics of the underlying physiological systems. © 2014 Springer-Verlag Berlin Heidelberg.
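The chapter itself details the estimator; as a rough, hypothetical illustration of the kind of procedure the abstract describes, the Python sketch below assumes a uniform-quantization (binning) entropy estimator, a bias compensation based on the fraction of conditioning patterns observed only once, and a greedy stopping rule for the sequential conditioning. The function names, number of bins, candidate lags and synthetic data are illustrative choices, not the authors' implementation.

```python
import numpy as np

def quantize(x, n_bins=6):
    """Uniformly quantize a 1-D series into n_bins integer symbols."""
    edges = np.linspace(np.min(x), np.max(x), n_bins + 1)[1:-1]
    return np.digitize(x, edges)

def shannon_entropy(symbols):
    """Plug-in Shannon entropy (nats) of the rows of a 2-D symbol array."""
    _, counts = np.unique(symbols, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def corrected_cond_entropy(target, cond):
    """H(target | cond) with a simple bias compensation: the fraction of
    conditioning patterns observed only once re-inflates the estimate
    towards the unconditioned entropy of the target (an assumed scheme)."""
    h_target = shannon_entropy(target.reshape(-1, 1))
    if cond.shape[1] == 0:
        return h_target
    ce = shannon_entropy(np.column_stack([target, cond])) - shannon_entropy(cond)
    _, counts = np.unique(cond, axis=0, return_counts=True)
    frac_single = np.sum(counts == 1) / cond.shape[0]
    return ce + frac_single * h_target

def sequential_cond_entropy(y, candidates):
    """Greedy sequential conditioning: repeatedly add the candidate column
    that most reduces the corrected CondEn of y; stop when no candidate
    yields a further reduction."""
    selected = np.empty((len(y), 0), dtype=int)
    best = corrected_cond_entropy(y, selected)
    remaining = list(range(candidates.shape[1]))
    while remaining:
        ce, j = min((corrected_cond_entropy(
            y, np.column_stack([selected, candidates[:, k]])), k) for k in remaining)
        if ce >= best:
            break
        best, selected = ce, np.column_stack([selected, candidates[:, j]])
        remaining.remove(j)
    return best

# Illustrative use on two synthetic coupled series (x drives y at lag 1).
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = 0.8 * np.roll(x, 1) + 0.2 * rng.standard_normal(500)
L = 3                                   # maximum lag of the candidate past samples
xq, yq = quantize(x), quantize(y)
y_now = yq[L:]
y_past = np.column_stack([yq[L - k:len(yq) - k] for k in range(1, L + 1)])
x_past = np.column_stack([xq[L - k:len(xq) - k] for k in range(1, L + 1)])

H_y = corrected_cond_entropy(y_now, np.empty((len(y_now), 0), dtype=int))
H_y_self = sequential_cond_entropy(y_now, y_past)                             # H(y_n | y^-)
H_y_full = sequential_cond_entropy(y_now, np.column_stack([y_past, x_past]))  # H(y_n | y^-, x^-)

SE = H_y - H_y_self        # self entropy (information storage)
TE = H_y_self - H_y_full   # transfer entropy x -> y
print(f"SE = {SE:.3f} nats, TE = {TE:.3f} nats")
```

Under these assumptions, self entropy and transfer entropy follow as differences of the minimized conditional entropies, SE = H(y_n) − H(y_n | y⁻) and TE = H(y_n | y⁻) − H(y_n | y⁻, x⁻), so that their sum recovers the predictive information H(y_n) − H(y_n | y⁻, x⁻) mentioned in the abstract's decomposition.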
| File | Size | Format | Access |
|---|---|---|---|
| B04-SpringerChapter.pdf | 503.82 kB | Adobe PDF | Restricted (archive administrators only; request a copy) |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.