# Linear Regression with Gradient Descent

This Demonstration shows how linear regression can determine the best fit to a collection of points by iteratively applying gradient descent. Linear regression works by minimizing the mean-squared error function $E(m,b)=\frac{1}{n}\sum_{i=1}^{n}\bigl(y_i-(m x_i+b)\bigr)^2$, where $n$ is the number of points. Because it is not always possible to solve for the minimum of this function analytically, gradient descent is used. Gradient descent consists of iteratively subtracting from a starting parameter value the slope of the error function at that point, multiplied by a constant called the learning rate. You can vary the number of gradient descent iterations, the number of points in the dataset, the seed for randomly generating the points, and the learning rate.
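The update rule described above can be sketched in plain Python (the Demonstration itself is written in the Wolfram Language; the function name, default parameters, and starting values below are illustrative assumptions, not the author's code):

```python
def gradient_descent(points, learning_rate=0.01, iterations=1000):
    """Fit y = m*x + b to (x, y) pairs by gradient descent on the
    mean-squared error E(m, b) = (1/n) * sum((y - (m*x + b))**2)."""
    m, b = 0.0, 0.0  # starting values (an assumption for this sketch)
    n = len(points)
    for _ in range(iterations):
        # Partial derivatives of E with respect to m and b
        grad_m = (-2.0 / n) * sum(x * (y - (m * x + b)) for x, y in points)
        grad_b = (-2.0 / n) * sum(y - (m * x + b) for x, y in points)
        # Subtract the slope times the learning rate
        m -= learning_rate * grad_m
        b -= learning_rate * grad_b
    return m, b
```

For points sampled from an exact line, the iterates converge toward that line's slope and intercept, provided the learning rate is small enough for the updates to remain stable.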


### Related Demonstrations

• Linear and Quadratic Curve Fitting Practice (Jon McLoone)
• Cumulative Frequency Curves (Phil Ramsden)
• Mean, Fitted-Value, Error, and Residual in Simple Linear Regression (Ian McLeod)
• Image Compression via the Singular Value Decomposition (Chris Maes)
• Auto-Regressive Simulation (Second-Order) (David von Seggern, University of Nevada)
• Solving a Linear System with Uncertain Coefficients (Valter Yoshihiko Aibe and Mikhail Dimitrov Mikhailov)
• Numerical Instability in the Gram-Schmidt Algorithm (Chris Boucher)
• Using Sampled Data to Estimate Derivatives, Integrals, and Interpolated Values (Robert L. Brown)
• Newton's Method on a Mesh of Initial Guesses (Ken Levasseur)
• Finite Difference Approximations of the First Derivative of a Function (Vincent Shatlock and Autar Kaw)