Practice fitting lines and curves to sample datasets, then compare your fit to the best possible.

You can choose to fit straight lines or quadratic curves, and can choose data generated from an underlying straight-line or quadratic model.

Drag the locators to move the line and try to make it the best possible fit to the random data. For extra help, show the error squares (the squared vertical distance between each point and your line) and try to minimize them. For further help, show the sum of the error squares and try to minimize that.
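The quantity being minimized by hand can be sketched numerically. The snippet below (a hypothetical example, not the Demonstration's code) generates noisy data from an assumed underlying line y = 2x + 1 and computes the sum of error squares for a hand-guessed line:

```python
import numpy as np

# Hypothetical dataset: an underlying line y = 2x + 1 with normal errors
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 20)
y = 2 * x + 1 + rng.normal(scale=1.0, size=x.size)

def sum_of_error_squares(slope, intercept):
    """Sum of squared vertical distances between the points and the line."""
    residuals = y - (slope * x + intercept)
    return np.sum(residuals ** 2)

# Adjusting slope and intercept by hand amounts to shrinking this number
print(sum_of_error_squares(1.8, 1.5))
```

Dragging the locators corresponds to varying `slope` and `intercept` until this sum is as small as you can make it.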

When you think you have the best fit, show the actual best fit and see how close you were. Then experiment with different datasets, models, and error sizes.

The best fit shown minimizes the sum of squared differences between the data and the fitted values. When the errors are normally distributed, this least-squares fit is also the maximum likelihood estimate, which is the usual criterion for the best possible fit. For other error distributions, least squares generally does not coincide with the maximum likelihood estimate.
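A minimal sketch of the comparison, assuming the same kind of noisy straight-line data as above: `np.polyfit` computes the least-squares polynomial fit (degree 1 for a line, degree 2 for a quadratic), and no perturbed line can have a smaller sum of squared errors.

```python
import numpy as np

# Hypothetical dataset: noisy line y = 2x + 1
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 20)
y = 2 * x + 1 + rng.normal(scale=1.0, size=x.size)

# np.polyfit minimizes the sum of squared residuals; deg=1 fits a line
slope, intercept = np.polyfit(x, y, deg=1)

def sse(m, b):
    """Sum of squared vertical residuals for the line y = m*x + b."""
    return np.sum((y - (m * x + b)) ** 2)

# The least-squares line beats any nearby hand-adjusted line
print(sse(slope, intercept))
print(sse(slope + 0.1, intercept))
print(sse(slope, intercept + 0.5))
```

Passing `deg=2` instead fits a quadratic, matching the Demonstration's quadratic model option.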