Bias-variance decomposition of absolute errors for diagnosing regression models of continuous data
| Field | Value |
|---|---|
| Main author | |
| Format | Online Article Text |
| Language | English |
| Published | Elsevier, 2021 |
| Subjects | |
| Online access | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8369249/ https://www.ncbi.nlm.nih.gov/pubmed/34430928 http://dx.doi.org/10.1016/j.patter.2021.100309 |
| Summary | Bias-variance decomposition (BVD) is a powerful tool for understanding and improving data-driven models. It reveals sources of estimation errors. Existing literature has defined BVD for squared error but not absolute error, although absolute error is the more natural error metric and has shown advantages over squared error in many scientific fields. Here, I analytically derive the absolute-error BVD, empirically investigate its behaviors, and compare it with other error metrics. Different error metrics offer distinctly different perspectives. I find that the commonly believed bias/variance trade-off under squared error is often absent under absolute error, and that ensembles, a technique that never hurts under squared error, can harm performance under absolute error. Compared with squared error, absolute-error BVD better promotes model traits that reduce estimation residuals and better illustrates the relative importance of different error sources. As data scientists pay increasing attention to uncertainty issues, the technique introduced here can be a useful addition to a data-driven modeler's toolset. |
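The summary refers to the classical bias-variance decomposition for squared error, which states that the expected squared error at a test point splits into irreducible noise, squared bias, and variance. The following sketch illustrates that classical decomposition via Monte Carlo simulation; the true function, noise level, model class, and all parameter values are illustrative assumptions, not taken from the article, and the article's own absolute-error decomposition is not reproduced here.

```python
import numpy as np

# Hedged illustration of the classical squared-error bias-variance
# decomposition at one test point:
#   E[(f_hat(x0) - y)^2] = noise + bias^2 + variance
# All settings below (sin target, cubic fit, sigma, sample sizes)
# are arbitrary choices for demonstration.
rng = np.random.default_rng(0)

def f_true(x):
    return np.sin(x)

x0 = 1.0            # test point
sigma = 0.3         # std dev of observation noise
n_train = 30        # training-set size per run
n_runs = 2000       # number of independent training sets
degree = 3          # polynomial model capacity

# Train one model per simulated training set; record its prediction at x0.
preds = np.empty(n_runs)
for i in range(n_runs):
    x = rng.uniform(0.0, 2.0 * np.pi, n_train)
    y = f_true(x) + rng.normal(0.0, sigma, n_train)
    coef = np.polyfit(x, y, degree)
    preds[i] = np.polyval(coef, x0)

bias_sq = (preds.mean() - f_true(x0)) ** 2   # squared bias of the model family
variance = preds.var()                        # variance across training sets
noise = sigma ** 2                            # irreducible noise

# Direct Monte Carlo estimate of expected squared error on fresh labels at x0.
y_new = f_true(x0) + rng.normal(0.0, sigma, n_runs)
mse = ((preds - y_new) ** 2).mean()

print("bias^2 + variance + noise =", bias_sq + variance + noise)
print("direct MSE estimate       =", mse)  # the two should be close
```

Under absolute error, no comparably clean additive identity was previously available, which is the gap the article addresses.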