# What is quadratic least squares regression, and how can I use partial derivatives to derive the least squares equations?

What is quadratic least squares regression, and how can I use partial derivatives to derive the equations for the least squares fit? I looked it up online and a lot of people use their calculators to find the equation. Is there a way for me to do it algebraically, using calculus?


Quadratic least squares regression refers to approximating a function f(x) by a quadratic polynomial ax^2 + bx + c such that the sum of the squared errors at the known data points `(x_n, y_n)` is minimized.

If `(x_n, y_n)` is a point on the curve representing f(x), the squared error of the approximation at that point is `(y_n - (ax_n^2 + b*x_n + c))^2`.

If `(x_n, y_n)` is known for N points, the sum of the squared errors is `S = sum_(n = 1)^N (y_n - (ax_n^2 + b*x_n + c))^2` . The values of a, b and c can be determined by taking the partial derivative of S with respect to each of a, b and c and setting each derivative equal to 0. The resulting system of equations can then be solved for a, b and c.
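As a quick sketch of the objective being minimized (the function name and sample data here are my own, not from the answer), the error sum S can be written directly in Python:

```python
def squared_error_sum(a, b, c, xs, ys):
    """Sum of squared errors S for the quadratic a*x^2 + b*x + c
    over the data points (x_n, y_n)."""
    return sum((y - (a * x**2 + b * x + c))**2 for x, y in zip(xs, ys))

# The least squares fit chooses a, b, c to make this sum as small as possible;
# e.g. data lying exactly on y = 2x^2 - 3x + 1 gives an error sum of 0 there.
print(squared_error_sum(2, -3, 1, [0, 1, 2], [1, 0, 3]))
```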

`(del S)/(del a) = 2*sum_(n=1)^N (y_n - (ax_n^2 + b*x_n + c))*(-x_n^2) = 0`

=> `sum_(n=1)^N(y_n*x_n^2) = a*sum_(n=1)^N x_n^4 + b*sum_(n=1)^N x_n^3 + c*sum_(n=1)^N x_n^2`

Similarly, the partial derivative with respect to b gives:

`sum_(n=1)^N(y_n*x_n) = a*sum_(n=1)^N x_n^3 + b*sum_(n=1)^N x_n^2 + c*sum_(n=1)^N x_n`

Finally, taking the partial derivative with respect to c gives:

`sum_(n=1)^N(y_n) = a*sum_(n=1)^N x_n^2 + b*sum_(n=1)^N x_n + c*sum_(n=1)^N 1`

**These three equations (the normal equations) are solved simultaneously to determine the quadratic function ax^2 + bx + c that best approximates the function f(x).**