The method of Ordinary Least Squares explained briefly

This method is attributed to Carl Friedrich Gauss. Its very attractive statistical properties have made it the most powerful and popular method of regression analysis. To understand it, we first explain the least-squares principle.

Two-variable PRF (population regression function):

Yi=β_{1}+β_{2}Xi+µi

The PRF is not directly observable. We estimate it from the SRF (sample regression function):

Yi = β̂_{1} + β̂_{2}Xi + ûi

Yi = Ŷi + ûi, where Ŷi = β̂_{1} + β̂_{2}Xi

Ŷi is the estimated (conditional mean) value of Yi.

How is the SRF itself determined? To see this, let us proceed as follows.

ûi = Yi - Ŷi

ûi = Yi - (β̂_{1} + β̂_{2}Xi), substituting the value of Ŷi
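A minimal numeric sketch of this step (the data and the candidate coefficients b1 = 1, b2 = 2 are hypothetical, not the OLS estimates):

```python
# Residuals u_hat_i = Yi - (b1 + b2*Xi) for one candidate SRF.
X = [1.0, 2.0, 3.0, 4.0]
Y = [3.2, 4.9, 7.1, 8.8]

b1, b2 = 1.0, 2.0                            # assumed candidate coefficients
Y_hat = [b1 + b2 * x for x in X]             # fitted values Y_hat_i
u_hat = [y - yh for y, yh in zip(Y, Y_hat)]  # residuals u_hat_i = Yi - Y_hat_i
print(u_hat)
```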


This shows that ûi is simply the difference between the actual and estimated values of Y.

Choose the SRF in such a way that the residuals are as small as possible. Minimizing the plain sum of the residuals, ∑ûi = ∑(Yi - Ŷi), will not do, because large positive and negative residuals can cancel each other out. The least-squares criterion therefore minimizes the sum of squared residuals:

∑ûi^{2} = ∑(Yi - Ŷi)^{2}

∑ûi^{2} = ∑(Yi - β̂_{1} - β̂_{2}Xi)^{2}
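The criterion can be made concrete with a short sketch (the data and both candidate lines are hypothetical): the sum of squared residuals is a function of the chosen coefficients, and OLS picks the pair that makes it smallest.

```python
# Sum of squared residuals as a function of candidate coefficients.
def ssr(b1, b2, X, Y):
    """Sum of squared residuals for the line Y_hat = b1 + b2*X."""
    return sum((y - (b1 + b2 * x)) ** 2 for x, y in zip(X, Y))

X = [1.0, 2.0, 3.0, 4.0]
Y = [3.2, 4.9, 7.1, 8.8]

print(ssr(1.0, 2.0, X, Y))  # one candidate line
print(ssr(0.0, 2.5, X, Y))  # a worse candidate: larger sum of squares
```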

Minimize ∑ûi^{2} = ∑(Yi - β̂_{1} - β̂_{2}Xi)^{2} with respect to β̂_{1} and β̂_{2} by differentiating partially and setting each derivative to zero.

∂∑ûi^{2} / ∂β̂_{1} = 2∑(Yi - β̂_{1} - β̂_{2}Xi)(-1)

= -2∑(Yi - β̂_{1} - β̂_{2}Xi)

= -2∑ûi = 0, where ûi = Yi - β̂_{1} - β̂_{2}Xi

∂∑ûi^{2} / ∂β̂_{2} = 2∑(Yi - β̂_{1} - β̂_{2}Xi)(-Xi)

= -2∑(Yi - β̂_{1} - β̂_{2}Xi)Xi

= -2∑(YiXi - β̂_{1}Xi - β̂_{2}Xi^{2})

= -2∑ûiXi

= 0
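Setting both derivatives to zero gives the two normal equations, ∑ûi = 0 and ∑ûiXi = 0, which have the standard closed-form solution for the two-variable model. A sketch with hypothetical data:

```python
# Closed-form OLS for the two-variable model:
#   b2 = sum((Xi - Xbar)(Yi - Ybar)) / sum((Xi - Xbar)^2)
#   b1 = Ybar - b2 * Xbar
X = [1.0, 2.0, 3.0, 4.0]
Y = [3.2, 4.9, 7.1, 8.8]
n = len(X)
Xbar, Ybar = sum(X) / n, sum(Y) / n

b2 = (sum((x - Xbar) * (y - Ybar) for x, y in zip(X, Y))
      / sum((x - Xbar) ** 2 for x in X))
b1 = Ybar - b2 * Xbar

# The OLS residuals satisfy both first-order conditions:
u_hat = [y - (b1 + b2 * x) for x, y in zip(X, Y)]
print(sum(u_hat))                            # sum(u_hat)    ~ 0
print(sum(u * x for u, x in zip(u_hat, X)))  # sum(u_hat*X)  ~ 0
```

Both printed sums are zero up to floating-point rounding, confirming that the fitted line satisfies the normal equations derived above.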
