Approximate entropy as a measure of system complexity

March 1991 | STEVEN M. PINCUS
Approximate entropy (ApEn) is a measure of system complexity that can classify both deterministic chaotic and stochastic processes from as few as 1000 data points. The paper first examines the reliability of inferring deterministic chaos from correlation dimension algorithms, noting that convergence of a dimension estimate to a finite value does not by itself imply an underlying deterministic model. ApEn is then introduced as a family of formulas and statistics that can distinguish complex systems with relatively few data points, offering a more robust alternative to these traditional methods.

ApEn(m, r) is defined as the limit of ApEn(m, r, N) as N approaches infinity, where m is the run (embedding) length, r is the tolerance radius, and N is the number of data points. Roughly, it measures the logarithmic likelihood that runs of patterns that are close for m observations remain close on the next incremental comparison; greater regularity yields smaller ApEn values.

ApEn distinguishes deterministic from stochastic systems, and also discriminates among different classes of complex systems: low-dimensional deterministic systems, periodic systems, high-dimensional chaotic systems, and stochastic processes. The paper demonstrates this on the Rössler model with superimposed noise, the Hénon map, the logistic map, and the stochastic MIX model. ApEn values for MIX(p) increase with p, the probability that a deterministic sample is replaced by noise, showing that the statistic tracks changes in system complexity. ApEn is also shown to be effective in classifying heart rate data, distinguishing between healthy and sick neonates.

ApEn is contrasted with other entropy measures such as Kolmogorov–Sinai (K–S) entropy, which is infinite for any process with superimposed noise and is therefore not always useful for distinguishing complex systems in practice. ApEn provides a more practical and robust measure of system complexity, especially with limited data.
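The ApEn(m, r, N) statistic described above can be sketched in a few lines of NumPy. This is an illustrative implementation, not the paper's code: it follows the standard construction (count, for each length-m run, the fraction of runs within max-norm distance r, average the logarithms, and subtract the analogous quantity for length m + 1). The default r = 0.2 × std is a common later convention, not something prescribed in this paper.

```python
import numpy as np

def apen(u, m=2, r=None):
    """Approximate entropy ApEn(m, r, N) of a 1-D series u.

    m : run (embedding) length; r : tolerance radius
    (defaults to 0.2 * std(u), a common convention, assumed here).
    """
    u = np.asarray(u, dtype=float)
    if r is None:
        r = 0.2 * np.std(u)

    def phi(m):
        # Rows are the vectors x(i) = [u(i), ..., u(i+m-1)].
        x = np.lib.stride_tricks.sliding_window_view(u, m)
        # Max-norm (Chebyshev) distance between every pair of vectors.
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        # C_i^m(r): fraction of vectors within r of x(i); the
        # self-match keeps every C_i > 0, so the log is finite.
        c = np.mean(d <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)
```

A constant series gives ApEn = 0, a short-period alternation gives a value near 0, and i.i.d. noise gives a markedly larger value, matching the regularity interpretation above. The O(N²) pairwise distance matrix is fine at N ≈ 1000 but would need a streaming count for much longer series.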
The paper also discusses the theoretical foundations of ApEn, including its application to stationary processes and Markov chains, and its potential for further development in statistical analysis and information theory.
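For concreteness, the MIX(p) family used in the comparisons can be sampled as below. This is a sketch assuming the usual MIX construction: samples of a unit-variance period-12 sine, each independently replaced with probability p by unit-variance uniform noise (the √2 amplitude and the (−√3, √3) range are what give both components unit variance); consult the paper for the exact definition.

```python
import numpy as np

def mix(p, n=1000, seed=0):
    """Sample n points of a MIX(p)-style process: a deterministic
    period-12 sine whose samples are each independently replaced
    by uniform noise with probability p."""
    rng = np.random.default_rng(seed)
    j = np.arange(1, n + 1)
    x = np.sqrt(2) * np.sin(2 * np.pi * j / 12)    # deterministic part
    y = rng.uniform(-np.sqrt(3), np.sqrt(3), n)    # i.i.d. noise part
    z = rng.random(n) < p                          # Bernoulli(p) mask
    return np.where(z, y, x)
```

MIX(0) is purely periodic and MIX(1) is pure noise, so sweeping p from 0 to 1 interpolates between the two regimes; as summarized above, the paper reports that ApEn computed on such series increases with p.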