Interpretation of Entropy Algorithms in the Context of Biomedical Signal Analysis and Their Application to EEG Analysis in Epilepsy
Main Authors:
Format: Online Article Text
Language: English
Published: MDPI, 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7515369/ http://dx.doi.org/10.3390/e21090840
Summary: Biomedical signals are measurable time series that describe a physiological state of a biological system. Entropy algorithms have previously been used to quantify the complexity of biomedical signals, but there is a need to understand the relationship of entropy to signal processing concepts. In this study, ten synthetic signals representing widely encountered signal structures in the field of signal processing were created to interpret permutation, modified permutation, sample, quadratic sample and fuzzy entropies. Subsequently, the entropy algorithms were applied to two different databases containing electroencephalogram (EEG) signals from epilepsy studies. Transitions from randomness to periodicity were successfully detected in the synthetic signals, while significant differences in EEG signals were observed based on different regions and states of the brain. In addition, using results from one entropy algorithm as features and the k-nearest neighbours algorithm, maximum classification accuracies in the first EEG database ranged from 63% to 73.5%, while these values increased by approximately 20% when two different entropies were used as features. For the second database, maximum classification accuracy reached 62.5% using one entropy algorithm, while using two algorithms as features increased that figure by a further 10%. Embedding entropies (sample, quadratic sample and fuzzy entropies) were found to outperform the rest of the algorithms in terms of sensitivity and show greater potential given the fine-tuning possibilities they offer. On the other hand, permutation and modified permutation entropies are more consistent across different input parameter values and considerably faster to calculate.
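To make the idea of an ordinal entropy measure concrete, the sketch below implements standard permutation entropy (counting the relative frequencies of ordinal patterns and taking the normalised Shannon entropy of that distribution). The embedding order, delay, and test signals here are illustrative choices, not the parameters used in the study itself, and this is not the authors' implementation; it only demonstrates the transition the abstract describes, where a periodic signal scores lower than an irregular one.

```python
import math
import random

def permutation_entropy(signal, order=3, delay=1):
    """Normalised permutation entropy of a 1-D signal.

    Slides a window of `order` samples (spaced by `delay`) along the
    signal, maps each window to its ordinal pattern (the permutation
    that sorts it), and returns the Shannon entropy of the pattern
    distribution divided by log(order!), so the result lies in [0, 1].
    """
    n = len(signal) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = tuple(signal[i + j * delay] for j in range(order))
        # Ordinal pattern: indices of the window sorted by value
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = [c / n for c in counts.values()]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))

# A periodic signal visits few ordinal patterns (low entropy);
# uniform noise visits all of them roughly equally (entropy near 1).
periodic = [math.sin(0.5 * i) for i in range(1000)]
random.seed(0)
noisy = [random.random() for _ in range(1000)]
print(permutation_entropy(periodic))  # markedly lower than the noisy signal
print(permutation_entropy(noisy))     # close to 1
```

In a classification pipeline like the one the abstract describes, values such as these (computed per EEG segment, possibly from two different entropy algorithms) would serve as the feature vectors fed to a k-nearest-neighbours classifier.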