
Multicollinearity and misleading statistical results


Bibliographic Details
Main Author: Kim, Jong Hae
Format: Online Article Text
Language: English
Published: Korean Society of Anesthesiologists 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6900425/
https://www.ncbi.nlm.nih.gov/pubmed/31304696
http://dx.doi.org/10.4097/kja.19087
Description
Summary: Multicollinearity represents a high degree of linear intercorrelation between explanatory variables in a multiple regression model and leads to incorrect results of regression analyses. Diagnostic tools for multicollinearity include the variance inflation factor (VIF), the condition index and condition number, and the variance decomposition proportion (VDP). Multicollinearity can be expressed by the coefficient of determination (R_h^2) of a multiple regression model that takes one explanatory variable (X_h) as its response variable and the others (X_i, i ≠ h) as its explanatory variables. The variance (σ_h^2) of the corresponding regression coefficient in the final regression model is proportional to the VIF, VIF_h = 1 / (1 − R_h^2). Hence, an increase in R_h^2 (strong multicollinearity) increases σ_h^2, and a larger σ_h^2 produces unreliable probability values and confidence intervals for the regression coefficients. The square root of the ratio of the maximum eigenvalue to each eigenvalue of the correlation matrix of the standardized explanatory variables is referred to as the condition index, and the condition number is the maximum condition index. Multicollinearity is present when the VIF is higher than 5 to 10 or the condition indices are higher than 10 to 30; however, these measures cannot identify which explanatory variables are multicollinear. VDPs obtained from the eigenvectors can identify the multicollinear variables by showing the extent of the inflation of σ_h^2 attributable to each condition index. When two or more VDPs that correspond to a common condition index higher than 10 to 30 are themselves higher than 0.8 to 0.9, their associated explanatory variables are multicollinear. Excluding multicollinear explanatory variables leads to statistically stable multiple regression models.
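
As a rough illustration of the diagnostics summarized above (a sketch, not code from the article), the following NumPy snippet computes VIFs, condition indices, the condition number, and VDPs for a design matrix. The function and variable names are assumptions chosen for the example.

import numpy as np

def collinearity_diagnostics(X):
    """Return VIFs, condition indices, the condition number, and the
    variance decomposition proportions (VDPs) for the columns of X."""
    # Standardize the explanatory variables (zero mean, unit variance).
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    p = Z.shape[1]

    # VIF_h = 1 / (1 - R_h^2), where R_h^2 is the coefficient of determination
    # from regressing X_h on the remaining explanatory variables.
    vif = np.empty(p)
    for h in range(p):
        y = Z[:, h]
        others = np.delete(Z, h, axis=1)
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1.0 - (resid @ resid) / (y @ y)   # y is centered, so y@y = SS_total
        vif[h] = 1.0 / (1.0 - r2)

    # Condition indices: sqrt(max eigenvalue / each eigenvalue) of the
    # correlation matrix of the standardized explanatory variables.
    corr = np.corrcoef(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    cond_index = np.sqrt(eigvals.max() / eigvals)
    cond_number = cond_index.max()

    # VDPs: share of each coefficient's variance attributable to each
    # eigenvalue; phi[h, j] = v_{hj}^2 / lambda_j, normalized over j.
    phi = eigvecs ** 2 / eigvals
    vdp = (phi / phi.sum(axis=1, keepdims=True)).T   # rows follow cond_index
    return vif, cond_index, cond_number, vdp

# Example: x2 is nearly collinear with x1, so it should be flagged by a
# VIF above 5-10, a condition index above 10-30, and two VDPs above 0.8-0.9.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.05, size=100)
x3 = rng.normal(size=100)
vif, ci, cn, vdp = collinearity_diagnostics(np.column_stack([x1, x2, x3]))
print(vif, cn)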