Lecture Note
Stanford University — CS229 | Machine Learning
Academic year 2023
Understanding Linear Regression and the Cost Function

Linear regression is a widely used statistical technique for modeling the linear relationship between independent and dependent variables. It is used for prediction and forecasting of future data points and is a staple of machine learning. In this article, we take a close look at the linear regression model, the cost function, and the gradient descent algorithm. By the end, you should understand how these concepts work together to produce accurate predictions.

The Linear Regression Model

Linear regression is a simple model that assumes a linear relationship between the independent and dependent variables. The model is represented by the equation

y = w·x + b

where w is the weight (coefficient), b is the bias, x is the independent variable, and y is the dependent variable. The weight w is the slope of the line, and the bias b is its intercept.

The Cost Function

The cost function is used to assess how well the linear regression model fits the data. It is computed as the mean squared error (MSE) and measures the discrepancy between the predicted and actual values:

J(w, b) = (1 / (2m)) · Σ_{i=1}^{m} (y_i − (w·x_i + b))²

where y_i is the actual value, w·x_i + b is the predicted value, and m is the total number of data points. The goal of training is to find the w and b that minimize the cost function, yielding the most accurate predictions.

The Gradient Descent Algorithm

The linear regression model is fit by minimizing the cost function with gradient descent. The algorithm iteratively adjusts the weight w and the bias b until convergence, using the update rules

w = w − α · (1/m) · Σ_{i=1}^{m} ((w·x_i + b) − y_i) · x_i
b = b − α · (1/m) · Σ_{i=1}^{m} ((w·x_i + b) − y_i)
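The model, the cost function, and the update rules above can be sketched in plain Python. This is a minimal illustration; the example data, the learning rate α = 0.05, and the iteration count are assumptions chosen for the sketch, not values from the note:

```python
# Minimal sketch of linear regression fit by batch gradient descent.

def predict(w, b, x):
    # The model: y = w*x + b
    return w * x + b

def cost(w, b, xs, ys):
    # Mean squared error with the 1/(2m) convention: J(w, b)
    m = len(xs)
    return sum((y - predict(w, b, x)) ** 2 for x, y in zip(xs, ys)) / (2 * m)

def gradient_step(w, b, xs, ys, alpha):
    # One gradient descent update of w and b
    m = len(xs)
    dw = sum((predict(w, b, x) - y) * x for x, y in zip(xs, ys)) / m
    db = sum((predict(w, b, x) - y) for x, y in zip(xs, ys)) / m
    return w - alpha * dw, b - alpha * db

# Toy data generated from y = 2x + 1, so the minimizer is w = 2, b = 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

w, b = 0.0, 0.0
for _ in range(5000):
    w, b = gradient_step(w, b, xs, ys, alpha=0.05)
```

On this noiseless data the iterates converge toward w = 2, b = 1, driving the cost toward zero; on real data the cost settles at the best achievable MSE instead.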
where α is the learning rate, which controls the size of each update to w and b. Gradient descent is an iterative procedure that continues until a stopping criterion is met or the cost function reaches a sufficiently low value.
[Figure: Gradient descent for linear regression]