Generalization of Entropy Based Divergence Measures for Symbolic Sequence Analysis
Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measu...
Main Authors: Ré, Miguel A.; Azad, Rajeev K.
Format: Online Article, Text
Language: English
Published: Public Library of Science, 2014
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3984095/ https://www.ncbi.nlm.nih.gov/pubmed/24728338 http://dx.doi.org/10.1371/journal.pone.0093532
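The abstract describes the Jensen-Shannon divergence as a symmetrized and smoothed form of the Kullback-Leibler divergence. As a minimal sketch of that standard (unweighted, two-distribution) definition only, not of the generalization developed in the paper, the JSD can be computed against the mixture distribution as below; function names and the example distributions are illustrative assumptions.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions (log base 2)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def jensen_shannon_divergence(p, q):
    """Standard JSD: symmetrized, smoothed KL divergence taken against
    the mixture m = (p + q) / 2 (illustrative sketch, not the paper's generalized form)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Hypothetical symbol frequencies from two symbolic sequences
p = [0.4, 0.3, 0.2, 0.1]
q = [0.1, 0.2, 0.3, 0.4]
print(jensen_shannon_divergence(p, q))  # bounded in [0, 1] with log base 2
```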
Similar Items
- Symbolic Entropy Analysis and Its Applications
  by: Alcaraz, Raúl
  Published: (2018)
- The Lorenz Curve: A Proper Framework to Define Satisfactory Measures of Symbol Dominance, Symbol Diversity, and Information Entropy
  by: Camargo, Julio A.
  Published: (2020)
- Electroencephalogram–Electromyography Coupling Analysis in Stroke Based on Symbolic Transfer Entropy
  by: Gao, Yunyuan, et al.
  Published: (2018)
- Extreme Interval Entropy Based on Symbolic Analysis and a Self-Adaptive Method
  by: Xu, Zhuofei, et al.
  Published: (2019)
- Measuring the Coupling Direction between Neural Oscillations with Weighted Symbolic Transfer Entropy
  by: Li, Zhaohui, et al.
  Published: (2020)