Linear Regression: The Normal Equation.

The normal equation:


What if I told you there is a single equation that gives you the correct thetas directly, without implementing gradient descent or the cost function? That equation is the normal equation.

It turns out that using calculus and some algebra you can derive the normal equation:

theta = (X^T X)^(-1) X^T y

where X is the matrix of training examples (with a leading column of ones for the intercept) and y is the vector of targets.

Getting an intuition for it involves some linear algebra and calculus, so we will skip the derivation. But it turns out that implementing the normal equation in Python is a very easy task.


def useNormalEqn(self, x, y):
    # theta = inv(X^T X) X^T y -- the normal equation
    theta = (np.linalg.inv(x.T @ x)) @ x.T
    theta = theta @ y
    return theta


These few lines of code do exactly what the equation says. X's transpose is multiplied with X, then the inverse of the result is taken using NumPy's np.linalg.inv function and multiplied with X's transpose. Finally, the result is multiplied with y to get theta.
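To see the normal equation in action, here is a minimal standalone sketch. The function name use_normal_eqn and the sample data are assumptions for illustration (the original code is a method on a class); the data is generated from the line y = 2x + 1, so the recovered thetas should be the intercept 1 and slope 2.

```python
import numpy as np

def use_normal_eqn(x, y):
    # theta = inv(X^T X) X^T y -- the normal equation
    return np.linalg.inv(x.T @ x) @ x.T @ y

# Hypothetical example: noiseless data from y = 2x + 1.
# First column of ones provides the intercept term.
x = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

theta = use_normal_eqn(x, y)
print(theta)  # close to [1., 2.]: intercept 1, slope 2
```

Note that np.linalg.inv fails if X^T X is singular (e.g. when features are linearly dependent); in practice np.linalg.pinv or np.linalg.lstsq is the more robust choice.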

The complete code for linear regression can be found here: https://github.com/geekRishabhjain/MLpersonalLibrary/blob/master/RlearnPy/

