
On Selection Criteria for the Tuning Parameter in Robust Divergence


Bibliographic Details
Main Authors: Sugasawa, Shonosuke; Yonekura, Shouto
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8469821/
https://www.ncbi.nlm.nih.gov/pubmed/34573772
http://dx.doi.org/10.3390/e23091147
Description
Summary: Although robust divergences, such as the density power divergence and γ-divergence, are helpful for robust statistical inference in the presence of outliers, the tuning parameter that controls the degree of robustness is typically chosen by a rule of thumb, which may lead to inefficient inference. We here propose a selection criterion based on an asymptotic approximation of the Hyvärinen score applied to an unnormalized model defined by a robust divergence. The proposed selection criterion requires only the first- and second-order partial derivatives of an assumed density function with respect to the observations, which can be computed easily regardless of the number of parameters. We demonstrate the usefulness of the proposed method via numerical studies using normal distributions and regularized linear regression.
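For context on the kind of estimator the abstract refers to, below is a minimal sketch of robust location estimation with the density power divergence (BHHJ), assuming a normal model with known scale. This is not the authors' Hyvärinen-score selection criterion; it only illustrates the role of the tuning parameter (here `alpha`) whose data-driven choice the paper addresses. All function names and the value `alpha = 0.5` are illustrative assumptions.

```python
import numpy as np

def dpd_objective(mu, x, alpha, sigma=1.0):
    """BHHJ density power divergence objective for a normal location model.

    For the normal density, the integral term
    int f(y; mu, sigma)^(1+alpha) dy
    has the closed form (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha).
    """
    f = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / np.sqrt(2 * np.pi * sigma**2)
    integral = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return integral - (1 + 1 / alpha) * np.mean(f**alpha)

def dpd_location(x, alpha, grid_size=4001):
    # Deterministic grid minimization over mu; coarse but avoids getting
    # trapped in a local minimum near an outlier cluster.
    mus = np.linspace(x.min(), x.max(), grid_size)
    vals = np.array([dpd_objective(m, x, alpha) for m in mus])
    return mus[np.argmin(vals)]

rng = np.random.default_rng(0)
# 95 inliers from N(0, 1) plus 5 gross outliers at 10
x = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 10.0)])

mle = x.mean()                       # non-robust: dragged toward the outliers
robust = dpd_location(x, alpha=0.5)  # alpha = 0.5 is an illustrative choice
print(f"sample mean = {mle:.3f}, DPD estimate (alpha=0.5) = {robust:.3f}")
```

Larger `alpha` downweights low-density observations more aggressively (greater robustness at some efficiency cost), which is precisely the trade-off the proposed criterion tunes automatically; a grid search is used above instead of a gradient-based optimizer because the objective can have a spurious local minimum near the outlier cluster.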