Heteroscedasticity in the Logit Model

The purpose of these notes is to provide an example of heteroscedasticity and to explain how to account for unequal residual variances with a weighted least squares model. The specific example that we will use is an analysis of proportions data in a logit model.

Suppose that we observe our dependent variable as a proportion or percentage:

$$0 \le p_i \le 1$$

which reflects an underlying "true" probability:

$$0 \le \pi_i \le 1$$

If we used the raw percentage as our dependent variable, then our predictions would not be bounded by zero and one: some predictions could fall below zero and some could exceed one.

Converting the raw percentage to log odds gives us an unbounded dependent variable:

$$\ln\left(\frac{p_i}{1-p_i}\right) = \alpha + \beta X_i + \epsilon_i$$

but the residuals in such a logit model do not have constant variance:

$$\mathrm{var}(\epsilon_i) = \frac{1}{n_i \, \pi_i (1-\pi_i)}$$

where $n_i$ is the number of underlying observations from which the proportion $p_i$ was computed.

Instead, the residual variance will be larger when the true probability is closer to zero or one.

Assuming that our explanatory variable is correlated with the true probability, the residual variance will be larger at extreme values of our explanatory variable.
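
To see how quickly the variance grows at the extremes, we can evaluate the formula numerically. (This is only an illustration; the group size of 100 is an arbitrary assumption.)

```python
import numpy as np

# variance of the empirical log odds: 1 / (n * pi * (1 - pi))
n = 100
for pi in (0.05, 0.25, 0.50, 0.75, 0.95):
    print(f"pi = {pi:.2f}  ->  var = {1.0 / (n * pi * (1.0 - pi)):.4f}")

# pi = 0.50 gives the smallest variance (0.0400), while
# pi = 0.05 or 0.95 gives a variance about five times larger (0.2105)
```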

Because we know the nature of the heteroscedasticity, we can modify our regression model to account for it. All we need is an unbiased estimate of the true probability for each observation.

OLS will provide such an unbiased estimate if there is no correlation between the residual and the explanatory variable (i.e. if the explanatory variable only affects the residual variance, not the residual itself).

So our first step is to estimate the regression coefficients with OLS:

$$\ln\left(\frac{p_i}{1-p_i}\right) = \alpha + \beta X_i + \epsilon_i$$

and then use the estimated coefficients to predict the true log odds:

$$\ln\left(\frac{\hat{\pi}_i}{1-\hat{\pi}_i}\right) = \hat{\alpha} + \hat{\beta} X_i$$
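
As a concrete illustration, here is a minimal sketch of this first step in Python. Everything in it is hypothetical: the simulated data and the variable names (p for the observed proportions, X for the explanatory variable, n for the number of underlying observations in each group) are assumptions made only to keep the example self-contained.

```python
import numpy as np

# hypothetical proportions data: n_i underlying observations per group,
# with true probabilities generated from an assumed logit relationship
rng = np.random.default_rng(0)
X = rng.uniform(-1.5, 1.5, size=50)
n = np.full(X.shape, 100)
true_pi = 1.0 / (1.0 + np.exp(-(0.25 + 1.0 * X)))
p = rng.binomial(n, true_pi) / n
p = np.clip(p, 0.5 / n, 1.0 - 0.5 / n)        # guard against p of exactly 0 or 1

# step 1: OLS on the empirical log odds
y = np.log(p / (1.0 - p))                      # ln( p_i / (1 - p_i) )
A = np.column_stack([np.ones_like(X), X])      # design matrix [1, X_i]
(alpha_hat, beta_hat), *_ = np.linalg.lstsq(A, y, rcond=None)

# predicted "true" log odds for each observation
logit_hat = alpha_hat + beta_hat * X
```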

But what we really want is an unbiased prediction of the true probability:

$$\hat{\pi}_i = \frac{1}{1+\exp(-\hat{\alpha} - \hat{\beta} X_i)}$$

which we can use to weight each observation:

$$w_i = \sqrt{n_i \, \hat{\pi}_i (1-\hat{\pi}_i)}$$
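
Continuing the same sketch, the predicted probabilities and the weights might be computed as follows. (The square root is what makes the weighted residual $w_i \epsilon_i$ in the next step have roughly constant variance, since $\mathrm{var}(\epsilon_i) = 1/(n_i \pi_i(1-\pi_i))$.)

```python
# unbiased prediction of the true probability from the first-step fit
pi_hat = 1.0 / (1.0 + np.exp(-logit_hat))

# weights: w_i = sqrt( n_i * pi_hat_i * (1 - pi_hat_i) ), so that
# var(w_i * eps_i) is approximately constant across observations
w = np.sqrt(n * pi_hat * (1.0 - pi_hat))
```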

And in the second step, we apply the weight to each variable and estimate the regression coefficients:

$$w_i \ln\left(\frac{p_i}{1-p_i}\right) = \alpha \, w_i + \beta \, w_i X_i + w_i \epsilon_i$$

Weighting each observation in this manner accounts for the heteroscedasticity, and the residuals in our regression model should now have constant variance.
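
Continuing the sketch one last time, the second step is simply OLS on the weighted variables (note that the constant term is also multiplied by $w_i$):

```python
# step 2: multiply every variable, including the constant, by w_i and re-run OLS
yw = w * y                                     # w_i * ln( p_i / (1 - p_i) )
Aw = np.column_stack([w, w * X])               # [ w_i , w_i * X_i ]
(alpha_wls, beta_wls), *_ = np.linalg.lstsq(Aw, yw, rcond=None)

print("first-step OLS: ", alpha_hat, beta_hat)
print("second-step WLS:", alpha_wls, beta_wls)
```

Most regression packages will also accept weights directly; in that case the weight supplied would typically be $n_i \, \hat{\pi}_i (1-\hat{\pi}_i)$ rather than its square root, because the package applies the weight to the squared residuals for us.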

