Minimum Divergence Estimators, Maximum Likelihood and the Generalized Bootstrap
This paper shows that most commonly used minimum divergence estimators are maximum likelihood estimators (MLEs) under suitably chosen generalized bootstrap sampling schemes. Optimality in the sense of Bahadur for the associated tests of fit under such sampling is also considered.
Main author: Broniatowski, Michel
Format: Online Article Text
Language: English
Published: MDPI, 2021
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7911913/
https://www.ncbi.nlm.nih.gov/pubmed/33572695
http://dx.doi.org/10.3390/e23020185
Similar Items
- Composite Likelihood Methods Based on Minimum Density Power Divergence Estimator
  by: Castilla, Elena, et al.
  Published: (2017)
- Branch length estimation and divergence dating: estimates of error in Bayesian and maximum likelihood frameworks
  by: Schwartz, Rachel S, et al.
  Published: (2010)
- Bootstrap, Bayesian probability and maximum likelihood mapping: exploring new tools for comparative genome analyses
  by: Zhaxybayeva, Olga, et al.
  Published: (2002)
- Maximum Penalized Likelihood Estimation
  by: LaRiccia, Vincent N, et al.
  Published: (2009)
- Maximum likelihood estimation of functional relationships
  by: Nagelkerke, Nico J D
  Published: (1992)