Document Type

Article

Publication Date

4-2022

DOI

10.3390/e24040524

Publication Title

Entropy

Volume

24

Issue

4

Pages

524 (25 pp.)

Abstract

Sample entropy, an approximation of the Kolmogorov entropy, was proposed to characterize the complexity of a time series. It is essentially defined as −log(B/A), where B denotes the number of matched template pairs of length m + 1 and A denotes the number of matched template pairs of length m, for a predetermined positive integer m. Sample entropy has been widely used to analyze physiological signals. Because computing it is time-consuming, the box-assisted, bucket-assisted, x-sort, assisted sliding box, and kd-tree-based algorithms were proposed to accelerate its computation. These algorithms require O(N^2) or O(N^(2−1/(m+1))) computational complexity, where N is the length of the time series analyzed; when N is large, their computational costs remain substantial. We propose a super fast Monte Carlo algorithm to estimate sample entropy, with a computational cost independent of N (the length of the time series) and an estimate that converges to the exact sample entropy as the number of repeated experiments becomes large. The convergence rate of the algorithm is also established. Numerical experiments are performed for electrocardiogram time series, electroencephalogram time series, cardiac inter-beat time series, mechanical vibration signals (MVS), meteorological data (MD), and 1/f noise. Numerical results show that the proposed algorithm achieves a 100–1000 times speedup compared to the kd-tree and assisted sliding box algorithms while providing satisfactory approximation accuracy.
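Since the abstract only sketches the Monte Carlo idea, the following is a minimal illustrative sketch in Python, not the authors' algorithm as published: the function name sampen_monte_carlo, the parameter n_pairs, and the common tolerance convention r times the standard deviation are assumptions for illustration. The idea it demonstrates is estimating the m-match and (m + 1)-match probabilities from randomly sampled template pairs rather than enumerating all O(N^2) pairs, so the cost is governed by the number of sampled pairs rather than N.

```python
import numpy as np

def sampen_monte_carlo(x, m=2, r=0.2, n_pairs=100_000, rng=None):
    """Monte Carlo estimate of sample entropy (hypothetical sketch)."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    tol = r * x.std()             # assumed tolerance: fraction of std. dev.
    n_templates = len(x) - m      # start indices of length-(m+1) templates
    # Sample random pairs of distinct template start indices.
    i = rng.integers(0, n_templates, size=n_pairs)
    j = rng.integers(0, n_templates, size=n_pairs)
    keep = i != j
    i, j = i[keep], j[keep]
    # Chebyshev distance over the first m components decides an m-match;
    # including the (m+1)-th component decides an (m+1)-match.
    offsets = np.arange(m + 1)
    diff = np.abs(x[i[:, None] + offsets] - x[j[:, None] + offsets])
    a_hat = (diff[:, :m].max(axis=1) <= tol).mean()  # est. P(m-match)
    b_hat = (diff.max(axis=1) <= tol).mean()         # est. P((m+1)-match)
    # If matches are rare, b_hat may be 0; increase n_pairs in that case.
    return -np.log(b_hat / a_hat)

# Example: estimate sample entropy of a long white-noise series.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
print(sampen_monte_carlo(x, m=2, r=0.2))
```

As the number of sampled pairs grows, the estimated ratio B/A converges to the exact one, mirroring the convergence behavior described in the abstract.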

Comments

Copyright: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

Original Publication Citation

Liu, W., Jiang, Y., & Xu, Y. (2022). A super fast algorithm for estimating sample entropy. Entropy, 24(4), Article 524. https://www.mdpi.com/1099-4300/24/4/524
