Complex temporal topic evolution modelling using the Kullback-Leibler divergence and the Bhattacharyya distance
The rapidly expanding corpus of medical research literature presents major challenges in the understanding of previous work, the extraction of maximum information from collected data, and the identification of promising research directions. We present a case for the use of advanced machine learning...
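For orientation, the two dissimilarity measures named in the title can be computed for discrete probability distributions as below. This is a minimal illustrative sketch of the standard definitions, not the authors' implementation; the function names and sample distributions are invented for the example.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) = sum_i p_i * ln(p_i / q_i).

    Assumes q_i > 0 wherever p_i > 0; terms with p_i == 0 contribute zero.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance D_B(P, Q) = -ln( sum_i sqrt(p_i * q_i) )."""
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))  # Bhattacharyya coefficient
    return -math.log(bc)

# Two example (hypothetical) topic distributions over three terms.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))
print(bhattacharyya_distance(p, q))
```

Note that the KL divergence is asymmetric in its arguments, whereas the Bhattacharyya distance is symmetric; both vanish when the two distributions coincide.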
Main Authors: Andrei, Victor; Arandjelović, Ognjen
Format: Online Article Text
Language: English
Published: Springer International Publishing, 2016
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5042987/ https://www.ncbi.nlm.nih.gov/pubmed/27746813 http://dx.doi.org/10.1186/s13637-016-0050-0
Similar Items

- Kullback–Leibler divergence and the Pareto–Exponential approximation
  by: Weinberg, G. V.
  Published: (2016)
- Computation of Kullback–Leibler Divergence in Bayesian Networks
  by: Moral, Serafín, et al.
  Published: (2021)
- Kullback Leibler divergence in complete bacterial and phage genomes
  by: Akhter, Sajia, et al.
  Published: (2017)
- Minimising the Kullback–Leibler Divergence for Model Selection in Distributed Nonlinear Systems
  by: Cliff, Oliver M., et al.
  Published: (2018)
- Kullback–Leibler Divergence of a Freely Cooling Granular Gas
  by: Megías, Alberto, et al.
  Published: (2020)