Regression, Big O Notation, and Gradient Descent — How Are They All Related?


The Solution — Gradient Descent

Gradient descent is an iterative optimization algorithm used to reduce model error when training a machine learning model. Applied to a convex function, it repeatedly adjusts the model's parameters to move the function toward a local minimum (which, for a convex function, is also the global minimum). In regression, gradient descent is used to find the parameter values (coefficients) that minimize a cost function, such as root mean square error (RMSE). The key reason gradient descent is used for linear regression is computational complexity: in many situations, particularly with large numbers of features, finding the answer with gradient descent is computationally cheaper than computing the exact closed-form solution.
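To make the idea concrete, here is a minimal sketch of batch gradient descent fitting a simple linear regression by minimizing mean squared error. The toy data, learning rate, and iteration count are illustrative assumptions, not values from the article:

```python
# A minimal sketch of batch gradient descent for simple linear
# regression (y ≈ w*x + b), minimizing mean squared error (MSE).
# The data, learning rate, and iteration count are illustrative.

def gradient_descent(x, y, lr=0.05, n_iters=5000):
    """Fit y ≈ w*x + b by repeatedly stepping against the MSE gradient."""
    n = len(x)
    w, b = 0.0, 0.0
    for _ in range(n_iters):
        # Predictions under the current parameters.
        y_pred = [w * xi + b for xi in x]
        # Partial derivatives of MSE with respect to w and b.
        dw = (2 / n) * sum((yp - yi) * xi for xi, yi, yp in zip(x, y, y_pred))
        db = (2 / n) * sum(yp - yi for yi, yp in zip(y, y_pred))
        # Move each parameter a small step downhill along the gradient.
        w -= lr * dw
        b -= lr * db
    return w, b

# Toy data lying exactly on the line y = 2x + 1.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.0, 3.0, 5.0, 7.0, 9.0]
w, b = gradient_descent(x, y)
```

Because MSE for linear regression is convex, each downhill step moves the parameters toward the single global minimum, so `w` and `b` converge close to 2 and 1 here.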

Acusio Bivona

Fitness, Sports, Data — And not necessarily in that order