Conventional regression lets you take the values of independent variables and predict the mean value of a dependent variable. It thus provides a single equation. The relatively new method of "quantile regression" lets you take the values of independent variables and predict a chosen quantile of the dependent variable. It thus provides a family of equations. To determine the 0.35 quantile of the dependent variable one uses one member of this family, whereas to predict the 0.65 quantile of the dependent variable one uses another.
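The "family of equations" idea can be made concrete with a small sketch. Quantile regression fits a separate line for each quantile by minimizing the pinball (check) loss; the code below is a minimal illustration in Python using numpy and scipy's general-purpose minimizer, with made-up data and illustrative parameter choices, not the Demonstration's own implementation.

```python
import numpy as np
from scipy.optimize import minimize

def pinball_loss(params, x, y, q):
    """Mean pinball (check) loss of the line a + b*x at quantile q."""
    a, b = params
    r = y - (a + b * x)
    return np.mean(np.maximum(q * r, (q - 1) * r))

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 500)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 500)   # synthetic data: a line plus noise

# One fitted equation per quantile: a family of lines, not a single line.
fits = {}
for q in (0.35, 0.65):
    res = minimize(pinball_loss, x0=[1.0, 1.0], args=(x, y, q),
                   method="Nelder-Mead")
    fits[q] = res.x
    print(f"q={q}: y = {res.x[0]:.2f} + {res.x[1]:.2f} x")
```

The two fitted lines share roughly the same slope here (the noise is homoskedastic) but have different intercepts, which is exactly the "one function per quantile" picture described above.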

Just as conventional regression rests on the assumption that the distribution of errors around a prediction is normal, so too quantile regression carries a normality assumption. More specifically, the errors of the quantile functions are assumed to be normally distributed.
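The classical result behind this assumption is that the sample q-quantile of n draws is asymptotically normal with mean x_q and variance q(1-q)/(n f(x_q)^2), where f is the underlying density. The following sketch (my own illustration with numpy/scipy, using arbitrary sample sizes) checks this formula by Monte Carlo for a standard normal underlying distribution:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n, reps, q = 200, 5000, 0.35

# Empirical: the q-quantile of many independent samples of size n.
qhats = np.quantile(rng.normal(0, 1, (reps, n)), q, axis=1)

# Theory: asymptotic variance q(1-q) / (n * f(x_q)^2),
# with f the density of the underlying distribution.
x_q = norm.ppf(q)
theo_var = q * (1 - q) / (n * norm.pdf(x_q) ** 2)

print(f"empirical variance:   {qhats.var():.5f}")
print(f"theoretical variance: {theo_var:.5f}")
```

The two variances agree closely, which is the normality story the Demonstration lets you probe for non-normal underlying distributions as well.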

This Demonstration lets you examine the normality assumption that lies behind quantile regression. You select an underlying distribution (normal, gamma, or beta), set its parameters, and choose the size of an underlying random sample drawn from it. You then draw derivative random samples from this underlying distribution, determining both the size of each derivative sample and the number of derivative samples taken. Finally, you choose the quantile value at which each derivative sample is evaluated. This produces a quantile sample.
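The sampling pipeline just described can be sketched in a few lines. This is my own Python rendering with numpy, not the Demonstration's code: the distribution, its parameters, and all sample sizes are illustrative stand-ins for the controls the Demonstration exposes, and whether the derivative samples are drawn afresh from the distribution or resampled from the underlying sample is an implementation detail not settled by this description (the sketch resamples with replacement).

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative stand-ins for the Demonstration's controls.
underlying_size = 1000   # size of the underlying random sample
deriv_size = 50          # size of each derivative sample
num_deriv = 400          # number of derivative samples
q = 0.35                 # quantile at which each derivative sample is evaluated

# Underlying sample from a gamma distribution (one of the three choices).
underlying = rng.gamma(shape=2.0, scale=1.5, size=underlying_size)

# Derivative samples, here resampled from the underlying sample
# with replacement (the Demonstration's exact scheme may differ).
deriv = rng.choice(underlying, size=(num_deriv, deriv_size), replace=True)

# Evaluating each derivative sample at quantile q yields the quantile sample.
quantile_sample = np.quantile(deriv, q, axis=1)
print(quantile_sample.shape)   # one quantile value per derivative sample
```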

This Demonstration takes the quantile sample, produces a histogram of it, determines the best-fit normal distribution for that quantile sample, compares the theoretically predicted variance with the actual variance of the best-fit normal distribution, and reports the results of a test for the normality of the quantile sample. Mousing over the last row of the output produces additional information on the fit of the quantile distribution.
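The analysis stage can likewise be sketched in Python with scipy. This is an illustration under stated assumptions, not the Demonstration's code: the gamma parameters and sample sizes are arbitrary, and the normality test used here is Shapiro-Wilk, since the description does not say which test the Demonstration applies.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# A stand-in quantile sample: 0.35-quantiles of 400 samples of size 50
# drawn from a gamma distribution (all parameters are illustrative).
q, n = 0.35, 50
qs = np.quantile(rng.gamma(2.0, 1.5, (400, n)), q, axis=1)

# Best-fit normal distribution for the quantile sample.
mu, sigma = stats.norm.fit(qs)

# Theoretically predicted variance of the q-quantile: q(1-q) / (n f(x_q)^2).
x_q = stats.gamma.ppf(q, a=2.0, scale=1.5)
pred_var = q * (1 - q) / (n * stats.gamma.pdf(x_q, a=2.0, scale=1.5) ** 2)

# A test for the normality of the quantile sample (Shapiro-Wilk here).
stat, pval = stats.shapiro(qs)

print(f"best fit: mu={mu:.3f}, variance={sigma**2:.4f}")
print(f"theoretically predicted variance: {pred_var:.4f}")
print(f"Shapiro-Wilk p-value: {pval:.3f}")
```

The actual variance of the best-fit normal tracks the theoretical prediction, while the test p-value shows how far the quantile sample departs from exact normality at this sample size.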