Simple Linear Regression
• Taking y as the dependent variable and x as the independent variable, the simple linear model for the population is given below:
y = β₀ + β₁x
• The model obtained from a statistical sample is ŷ = b₀ + b₁x
• This model looks like the line equation y = mx + c, where m is the gradient and c is the constant
• The parameters β₀, β₁ are estimated by b₀, b₁, which are optimal estimators. These estimators are obtained by:
• OLS (Ordinary Least Square) method
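A minimal sketch of the OLS estimators b₀ and b₁ using their closed-form expressions, assuming NumPy; the synthetic data with true line y = 2x + 1 is an assumption for illustration only:

```python
import numpy as np

# Hypothetical data: y ≈ 2x + 1 plus Gaussian noise (illustration only)
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=x.size)

# OLS estimators for the slope and intercept:
#   b1 = Σ(x_i - x̄)(y_i - ȳ) / Σ(x_i - x̄)²
#   b0 = ȳ - b1 * x̄
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
print(b0, b1)  # estimates close to the true intercept 1 and slope 2
```

Because the data are generated from y = 2x + 1, the estimates b₀ and b₁ land near 1 and 2.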
Gradient Descent
• Gradient descent is an optimization algorithm used to find the values of the parameters (coefficients) of a function f that minimize a cost function.
• The cost function:
Error(m, b) = (1/N) Σᵢ₌₁ᴺ (yᵢ − (mxᵢ + b))²
• To run gradient descent on this error function, we first need to compute its gradient. The gradient will act like a compass and always point us downhill. To compute it, we will need to differentiate our error function.
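A sketch of one gradient-descent update for Error(m, b), with the partial derivatives written out in the comments; the toy data and learning rate are assumptions for illustration:

```python
import numpy as np

# Gradients of Error(m, b) = (1/N) Σ (y_i - (m*x_i + b))²:
#   ∂E/∂m = (-2/N) Σ x_i * (y_i - (m*x_i + b))
#   ∂E/∂b = (-2/N) Σ (y_i - (m*x_i + b))
def step(m, b, x, y, lr):
    resid = y - (m * x + b)
    grad_m = -2.0 * np.mean(x * resid)
    grad_b = -2.0 * np.mean(resid)
    # Move against the gradient: the "compass" points downhill
    return m - lr * grad_m, b - lr * grad_b

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0          # points lying exactly on y = 2x + 1
m, b = 0.0, 0.0
for _ in range(5000):
    m, b = step(m, b, x, y, lr=0.05)
print(m, b)  # converges toward m = 2, b = 1
```

Iterating the update drives (m, b) to the slope and intercept that minimize the squared error.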
https://medium.com/meta-design-ideas/linear-regression-by-using-gradient-
• The learning rate variable controls how large a step we take downhill during each iteration.
• If we take too large of a step, we may step over the minimum.
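A small sketch of that effect, reusing the update rule above on assumed toy data: with a small learning rate the error shrinks, while a rate that is too large overshoots the minimum and the error grows each iteration:

```python
import numpy as np

def step(m, b, x, y, lr):
    resid = y - (m * x + b)
    # Descend: subtract lr times the gradient of the mean squared error
    return m + lr * 2.0 * np.mean(x * resid), b + lr * 2.0 * np.mean(resid)

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0

def error(m, b):
    return np.mean((y - (m * x + b)) ** 2)

# Small step: the error decreases toward the minimum.
m, b = 0.0, 0.0
for _ in range(100):
    m, b = step(m, b, x, y, lr=0.05)
small = error(m, b)

# Step too large: each update jumps past the minimum, so the error blows up.
m, b = 0.0, 0.0
for _ in range(100):
    m, b = step(m, b, x, y, lr=0.2)
large = error(m, b)
print(small < large)  # prints True
```

The learning rates 0.05 and 0.2 are illustrative; the stability threshold depends on the scale of the x values.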