
Selection of entropy-measure parameters for knowledge discovery in heart rate variability data

BACKGROUND: Heart rate variability is the variation of the time interval between consecutive heartbeats. Entropy is a commonly used tool to describe the regularity of data sets. Entropy functions are defined using multiple parameters, the selection of which is controversial and depends on the intended purpose.


Bibliographic Details
Main Authors: Mayer, Christopher C, Bachler, Martin, Hörtenhuber, Matthias, Stocker, Christof, Holzinger, Andreas, Wassertheurer, Siegfried
Format: Online Article Text
Language: English
Published: BioMed Central 2014
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4140209/
https://www.ncbi.nlm.nih.gov/pubmed/25078574
http://dx.doi.org/10.1186/1471-2105-15-S6-S2
_version_ 1782331488093601792
author Mayer, Christopher C
Bachler, Martin
Hörtenhuber, Matthias
Stocker, Christof
Holzinger, Andreas
Wassertheurer, Siegfried
author_facet Mayer, Christopher C
Bachler, Martin
Hörtenhuber, Matthias
Stocker, Christof
Holzinger, Andreas
Wassertheurer, Siegfried
author_sort Mayer, Christopher C
collection PubMed
description BACKGROUND: Heart rate variability is the variation of the time interval between consecutive heartbeats. Entropy is a commonly used tool to describe the regularity of data sets. Entropy functions are defined using multiple parameters, the selection of which is controversial and depends on the intended purpose. This study describes the results of tests conducted to support parameter selection, towards the goal of enabling further biomarker discovery. METHODS: This study deals with approximate, sample, fuzzy, and fuzzy measure entropies. All data were obtained from PhysioNet, a free-access, on-line archive of physiological signals, and represent various medical conditions. Five tests were defined and conducted to examine the influence of: varying the threshold value r (as multiples of the sample standard deviation σ, or the entropy-maximizing r(Chon)), the data length N, the weighting factors n for fuzzy and fuzzy measure entropies, and the thresholds r(F) and r(L) for fuzzy measure entropy. The results were tested for normality using Lilliefors' composite goodness-of-fit test. Depending on the outcome, the p-value was then calculated with either a two-sample t-test or a Wilcoxon rank sum test. RESULTS: The first test shows a cross-over of entropy values with regard to a change of r. Thus, a higher entropy value cannot simply be equated with higher irregularity; it is rather an indicator of differences in regularity. N should be at least 200 data points for r = 0.2σ and should even exceed a length of 1000 for r = r(Chon). The results for the weighting parameters n of the fuzzy membership function show different behavior when coupled with different r values; therefore, the weighting parameters were chosen independently for the different threshold values. The tests concerning r(F) and r(L) showed that there is no optimal choice, but r = r(F) = r(L) is reasonable with r = r(Chon) or r = 0.2σ. CONCLUSIONS: Some of the tests showed a dependency of the test significance on the data at hand. Nevertheless, as the medical conditions are unknown beforehand, compromises had to be made. Optimal parameter combinations are suggested for the methods considered. Yet, due to the high number of potential parameter combinations, further investigations of entropy for heart rate variability data will be necessary.
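For orientation, the following minimal Python sketch (not the authors' implementation) illustrates two ingredients described above: sample entropy of an RR-interval series with the threshold r = 0.2σ, and the significance step in which Lilliefors' test decides between a two-sample t-test and a Wilcoxon rank sum test. The embedding dimension m = 2, the function names, and the use of NumPy/SciPy/statsmodels are assumptions for illustration only.

    import numpy as np
    from scipy import stats
    from statsmodels.stats.diagnostic import lilliefors

    def sample_entropy(x, m=2, r=None):
        # Sample entropy of a 1-D series x (e.g. RR intervals).
        # r defaults to 0.2 * sigma, the convention discussed in the abstract;
        # m = 2 is a conventional embedding dimension (an assumption here).
        x = np.asarray(x, dtype=float)
        N = len(x)
        if r is None:
            r = 0.2 * np.std(x)

        def matches(k):
            # Count template pairs of length k whose Chebyshev distance is <= r,
            # excluding self-matches; N - m templates so lengths m and m+1 compare.
            templ = np.array([x[i:i + k] for i in range(N - m)])
            count = 0
            for i in range(len(templ)):
                d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
                count += np.sum(d <= r)
            return count

        B = matches(m)        # similar template pairs of length m
        A = matches(m + 1)    # similar template pairs of length m + 1
        return -np.log(A / B) if A > 0 and B > 0 else np.inf

    def compare_groups(a, b, alpha=0.05):
        # Lilliefors' composite goodness-of-fit test for normality first;
        # then a two-sample t-test if both groups look normal,
        # otherwise a Wilcoxon rank sum test.
        normal = lilliefors(a)[1] > alpha and lilliefors(b)[1] > alpha
        if normal:
            return stats.ttest_ind(a, b).pvalue
        return stats.ranksums(a, b).pvalue

On a record of at least 200 RR intervals (the minimum length suggested in the results for r = 0.2σ), sample_entropy(rr) followed by compare_groups(entropies_a, entropies_b) mirrors the general workflow; fuzzy and fuzzy measure entropies replace the hard threshold with a fuzzy membership function and are not sketched here.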
format Online
Article
Text
id pubmed-4140209
institution National Center for Biotechnology Information
language English
publishDate 2014
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-4140209 2014-08-28 Selection of entropy-measure parameters for knowledge discovery in heart rate variability data Mayer, Christopher C Bachler, Martin Hörtenhuber, Matthias Stocker, Christof Holzinger, Andreas Wassertheurer, Siegfried BMC Bioinformatics Research BACKGROUND: Heart rate variability is the variation of the time interval between consecutive heartbeats. Entropy is a commonly used tool to describe the regularity of data sets. Entropy functions are defined using multiple parameters, the selection of which is controversial and depends on the intended purpose. This study describes the results of tests conducted to support parameter selection, towards the goal of enabling further biomarker discovery. METHODS: This study deals with approximate, sample, fuzzy, and fuzzy measure entropies. All data were obtained from PhysioNet, a free-access, on-line archive of physiological signals, and represent various medical conditions. Five tests were defined and conducted to examine the influence of: varying the threshold value r (as multiples of the sample standard deviation σ, or the entropy-maximizing r(Chon)), the data length N, the weighting factors n for fuzzy and fuzzy measure entropies, and the thresholds r(F) and r(L) for fuzzy measure entropy. The results were tested for normality using Lilliefors' composite goodness-of-fit test. Depending on the outcome, the p-value was then calculated with either a two-sample t-test or a Wilcoxon rank sum test. RESULTS: The first test shows a cross-over of entropy values with regard to a change of r. Thus, a higher entropy value cannot simply be equated with higher irregularity; it is rather an indicator of differences in regularity. N should be at least 200 data points for r = 0.2σ and should even exceed a length of 1000 for r = r(Chon). The results for the weighting parameters n of the fuzzy membership function show different behavior when coupled with different r values; therefore, the weighting parameters were chosen independently for the different threshold values. The tests concerning r(F) and r(L) showed that there is no optimal choice, but r = r(F) = r(L) is reasonable with r = r(Chon) or r = 0.2σ. CONCLUSIONS: Some of the tests showed a dependency of the test significance on the data at hand. Nevertheless, as the medical conditions are unknown beforehand, compromises had to be made. Optimal parameter combinations are suggested for the methods considered. Yet, due to the high number of potential parameter combinations, further investigations of entropy for heart rate variability data will be necessary. BioMed Central 2014-05-16 /pmc/articles/PMC4140209/ /pubmed/25078574 http://dx.doi.org/10.1186/1471-2105-15-S6-S2 Text en Copyright © 2014 Mayer et al.; licensee BioMed Central Ltd. http://creativecommons.org/licenses/by/2.0 This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
spellingShingle Research
Mayer, Christopher C
Bachler, Martin
Hörtenhuber, Matthias
Stocker, Christof
Holzinger, Andreas
Wassertheurer, Siegfried
Selection of entropy-measure parameters for knowledge discovery in heart rate variability data
title Selection of entropy-measure parameters for knowledge discovery in heart rate variability data
title_full Selection of entropy-measure parameters for knowledge discovery in heart rate variability data
title_fullStr Selection of entropy-measure parameters for knowledge discovery in heart rate variability data
title_full_unstemmed Selection of entropy-measure parameters for knowledge discovery in heart rate variability data
title_short Selection of entropy-measure parameters for knowledge discovery in heart rate variability data
title_sort selection of entropy-measure parameters for knowledge discovery in heart rate variability data
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4140209/
https://www.ncbi.nlm.nih.gov/pubmed/25078574
http://dx.doi.org/10.1186/1471-2105-15-S6-S2
work_keys_str_mv AT mayerchristopherc selectionofentropymeasureparametersforknowledgediscoveryinheartratevariabilitydata
AT bachlermartin selectionofentropymeasureparametersforknowledgediscoveryinheartratevariabilitydata
AT hortenhubermatthias selectionofentropymeasureparametersforknowledgediscoveryinheartratevariabilitydata
AT stockerchristof selectionofentropymeasureparametersforknowledgediscoveryinheartratevariabilitydata
AT holzingerandreas selectionofentropymeasureparametersforknowledgediscoveryinheartratevariabilitydata
AT wassertheurersiegfried selectionofentropymeasureparametersforknowledgediscoveryinheartratevariabilitydata