Regarding the OLS estimator and the maximum likelihood estimator (MLE): why do some researchers compare MLE with OLS, and which method is better, and why?
The methods of least squares and maximum likelihood are two different statistical procedures. OLS estimates the coefficients of a linear regression model by minimizing the sum of squared differences between fitted and observed values, regardless of the distribution of the errors. Under the standard Gauss-Markov assumptions, least squares produces the best linear unbiased estimators of those coefficients.

If, however, the form of the error distribution is known, MLE can be used instead. In other words, to estimate the parameters by MLE you must assume a distribution for the random errors so that the likelihood function can be written down. Of course, if hypothesis tests and confidence intervals are to be constructed with OLS, the errors are usually assumed to be normally distributed as well; that is why the familiar t-tests, chi-square tests, and F-tests accompany OLS when the errors are normal. Under normal errors, in fact, the MLE of the coefficients coincides with the OLS estimator, which is why the two are often compared: they differ only when a non-normal error distribution is assumed.
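The point above can be illustrated with a small simulation. This is a minimal sketch, assuming simulated data with a made-up intercept, slope, and noise level: it fits the same linear model once by OLS (closed form) and once by numerically maximizing the Gaussian log-likelihood, and the two coefficient estimates agree.

```python
import numpy as np
from scipy.optimize import minimize

# Simulate y = b0 + b1*x + e with e ~ N(0, sigma^2); the true values
# (2.0, 3.0, 1.5) are arbitrary choices for illustration.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 3.0 * x + rng.normal(0, 1.5, n)

# OLS: minimize the sum of squared residuals (closed form via lstsq).
X = np.column_stack([np.ones(n), x])
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# MLE under normal errors: minimize the negative Gaussian log-likelihood
# over (b0, b1, log_sigma); parameterizing by log_sigma keeps sigma > 0.
def neg_log_lik(params):
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)
    resid = y - (b0 + b1 * x)
    return 0.5 * n * np.log(2 * np.pi * sigma**2) + np.sum(resid**2) / (2 * sigma**2)

res = minimize(neg_log_lik, x0=[0.0, 0.0, 0.0], method="BFGS")
beta_mle = res.x[:2]

print(beta_ols)  # OLS intercept and slope
print(beta_mle)  # MLE intercept and slope (matches OLS with normal errors)
```

With a non-normal error assumption (say, Laplace errors, whose MLE minimizes absolute rather than squared residuals), the two estimates would no longer coincide.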
Author Details
I'm Ijaz Marwat, an expert in basic economics and policy implementation.