Polynomial Regression Calculator
 What is polynomial regression?
 Polynomial regression definition
 What is the difference between linear and polynomial regression?
 How to find the polynomial regression coefficients?
 How to use this polynomial regression calculator?
 Matrix formula for polynomial regression
 System of linear equations for a polynomial regression model
 FAQ
So you find yourself needing to fit a polynomial regression model to a dataset... Thankfully, Omni's polynomial regression calculator is here! With its help, you'll be able to quickly determine the polynomial that best fits your data.
If you're not yet familiar with this concept and want to learn what polynomial regression is, don't hesitate to read the article below. It not only explains the definition of the polynomial regression model and provides all the necessary math formulas for the polynomial regression but also explains in friendly terms the difference between linear and polynomial regression!
What is polynomial regression?
Regression is a statistical method that attempts to model the values of one variable (called the dependent variable) based on the values of other variable(s) (one or more, known as independent variable(s)). For instance, we may want to find the relationship between people's weight and their height and sex, or between salaries and work experience and level of education.
In the polynomial regression model, we assume that the relationship between the dependent variable and a single independent variable is described by a polynomial of some arbitrary degree.
If you've already encountered the model of simple linear regression, where the relationship between the dependent and independent variables is modeled by a straight line of best fit, then you've seen the simplest example of polynomial regression, that is, where the polynomial has degree one! Now, imagine some data that you can't fit a straight line to, yet a parabola would fit perfectly. Since we can keep increasing the degree of the curve, we see why the polynomial regression model is so useful!
Polynomial regression definition
We now know what polynomial regression is, so it's time we discuss in more detail the mathematical side of the polynomial regression model. Here and henceforth, we will denote the dependent variable by y and the independent variable by x.

The polynomial regression equation reads:

y = a_{0} + a_{1}x + a_{2}x^{2} + ... + a_{n}x^{n},

where a_{0}, a_{1}, ..., a_{n} are called the coefficients and n is the degree of the polynomial regression model under consideration.
If you need a refresher on the topic of polynomials, check out Omni's dedicated polynomial calculators.
The equation with an arbitrary degree n might look a bit scary, but don't worry! In most real-life applications, we use polynomial regression of rather low degrees:
 Degree 1: y = a_{0} + a_{1}x
As we've already mentioned, this is simple linear regression, where we try to fit a straight line to the data points.

 Degree 2: y = a_{0} + a_{1}x + a_{2}x^{2}
Here we've got quadratic regression, also known as second-order polynomial regression, where we fit parabolas.

 Degree 3: y = a_{0} + a_{1}x + a_{2}x^{2} + a_{3}x^{3}
This is cubic regression, a.k.a. third-degree polynomial regression, where we deal with cubic functions, that is, curves of degree 3.

In the same vein, the polynomial regression model of degree n = 4 is called quartic regression (or fourth-order polynomial regression), n = 5 is quintic regression, n = 6 is sextic regression, and so on.
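To see these low-degree models in action, here is a minimal sketch (assuming NumPy and a hypothetical small dataset) that fits polynomials of degree 1, 2, and 3 with `numpy.polyfit`:

```python
import numpy as np

# Hypothetical sample data
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.1, 4.9, 10.2, 16.8])

# numpy.polyfit returns coefficients from highest degree to lowest,
# i.e. [a_n, ..., a_1, a_0]
linear = np.polyfit(x, y, 1)     # degree 1: simple linear regression
quadratic = np.polyfit(x, y, 2)  # degree 2: quadratic regression
cubic = np.polyfit(x, y, 3)      # degree 3: cubic regression

print("degree 1:", linear)
print("degree 2:", quadratic)
print("degree 3:", cubic)
```

Note that a degree-n fit returns n+1 coefficients, matching the equation above.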
What is the difference between linear and polynomial regression?
In many books, you can find a remark that polynomial regression is an example of linear regression. At the same time, and on the very same page, you see the parabolas and cubic curves generated by polynomial regression, and your head starts spinning. Why is polynomial regression linear if all the world can see that it models nonlinear relationships?
When we think of linear regression, we most often have in mind simple linear regression, which is the model where we fit a straight line to a dataset. We've already explained that simple linear regression is a particular case of polynomial regression, where we have polynomials of order 1.
However, when we talk about linear regression in general, what we have in mind is the family of regression models where the dependent variable is given by a function of the independent variable(s), and this function is linear in the coefficients a_{0}, a_{1}, ..., a_{n}. In other words, the model equation can contain all sorts of expressions like roots, logarithms, etc., and still be linear, on the condition that all that crazy stuff is applied to the independent variable(s) and not to the coefficients. For instance, the following model is an example of linear regression:

y = a_{0}sin(x) + a_{1}ln(x) + a_{2}x^{17} + a_{3}√x,

while this model is nonlinear:

y = a_{0} * x^{a_{1}}

because the coefficient a_{1} appears in the exponent. To sum up, it doesn't matter what happens to x. What matters is that nothing nonlinear happens to the coefficients: they appear only to the first power, and we don't multiply them by each other or act on them with any functions like roots, logs, trigonometric functions, etc.
And so the mystery of "why is polynomial regression linear?" is solved. Now go and spread the happy news among your peers!
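The "linear in coefficients" idea has a practical payoff: any model built from fixed functions of x can be fitted with ordinary least squares by stacking those functions as columns of a design matrix. A minimal sketch (hypothetical data, assuming NumPy) using the sin/ln/power/root model above:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(1.0, 3.0, 50)
# Hypothetical "true" coefficients used only to generate example data
y = 2.0 * np.sin(x) - 1.0 * np.log(x) + 0.5 * np.sqrt(x) + rng.normal(0, 0.01, x.size)

# Design matrix: one column per (nonlinear!) basis function of x.
# The model stays linear because the coefficients multiply these columns.
X = np.column_stack([np.sin(x), np.log(x), x**17, np.sqrt(x)])

# Ordinary least squares finds a_0, ..., a_3 in one shot
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coeffs)
```

The fit works even though sin, ln, and √ are wildly nonlinear in x, precisely because nothing nonlinear happens to the coefficients themselves.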
How to find the polynomial regression coefficients?
As always with regression, the main challenge is to determine the values of the coefficients a_{0}, a_{1}, ..., a_{n} based on the values of the data sample (x_{1},y_{1}), ..., (x_{N},y_{N}). To find the coefficients of the polynomial regression model, we usually resort to the least-squares method: we look for the values of a_{0}, a_{1}, ..., a_{n} that minimize the sum of squared distances between each data point:

(x_{i}, y_{i}),

and the corresponding point predicted by the polynomial regression equation:

(x_{i}, a_{0} + a_{1}x_{i} + ... + a_{n}x_{i}^{n}).
In other words, we want to minimize the following function:
(a_{0}, a_{1}, ..., a_{n}) ↦ ∑_{i}(a_{0} + a_{1}x_{i} + ... + a_{n}x_{i}^{n} - y_{i})^{2},

where i goes from 1 to N, i.e., we sum over the whole data set. If you think it's not at all obvious how to solve this problem, you're absolutely right. A quick solution is, of course, to use Omni's polynomial regression calculator 😉, so we'll first discuss how to use it most efficiently. Then we will explain how to determine the coefficients of the polynomial regression function by hand.
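To make the objective concrete, here is a small sketch (hypothetical data, assuming NumPy) that evaluates the sum of squares above; `numpy.polyfit` returns precisely the coefficients that minimize it:

```python
import numpy as np

def sse(coeffs, x, y):
    """Sum over i of (a_0 + a_1*x_i + ... + a_n*x_i^n - y_i)^2,
    with coeffs given as [a_0, a_1, ..., a_n]."""
    # np.polyval expects highest-degree coefficient first, so reverse
    pred = np.polyval(coeffs[::-1], x)
    return np.sum((pred - y) ** 2)

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 7.0, 13.0])  # hypothetical data: exactly 1 + x + x^2

# polyfit minimizes this very objective (coefficients come highest-first)
best = np.polyfit(x, y, 2)[::-1]
print(sse(best, x, y))  # essentially zero for this exact-fit example
```

Any other choice of coefficients gives a larger (or at best equal) sum of squares.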
How to use this polynomial regression calculator?
Here's a short instruction on how to use our polynomial regression calculator:
 Enter your data: you can enter up to 30 data points (new rows will appear as you go). Remember that we need at least n+1 points (both coordinates!) to fit a polynomial regression model of order n, and with exactly n+1 points, the fit is always perfect!
 The calculator will show you the scatter plot of your data along with the polynomial curve (of the degree you desired) fitted to your points.
 Below the scatter plot, you'll find the polynomial regression equation for your data.
 The coefficient of determination, R², measures how well the model fits your data points. It assumes values between 0 and 1, and the closer it is to 1, the better your polynomial regression model is.
 You can go to the Advanced mode if you need the polynomial regression calculator to perform calculations with a higher precision.
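If you'd like to reproduce the R² figure yourself, here is a sketch (hypothetical data, assuming NumPy) of the standard formula R² = 1 - SS_res / SS_tot:

```python
import numpy as np

def r_squared(x, y, degree):
    """Coefficient of determination for a polynomial fit of the given degree."""
    coeffs = np.polyfit(x, y, degree)
    pred = np.polyval(coeffs, x)
    ss_res = np.sum((y - pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y - np.mean(y)) ** 2)    # total sum of squares
    return 1.0 - ss_res / ss_tot

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.9, 10.1, 17.2, 25.8])   # hypothetical, roughly quadratic

print(r_squared(x, y, 1))  # straight line
print(r_squared(x, y, 2))  # parabola: closer to 1 for this data
```

Raising the degree can only decrease the residual sum of squares, so R² never drops as you add terms; a value close to 1 therefore isn't proof of a good model on its own.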
Matrix formula for polynomial regression
Let's briefly discuss how to calculate the coefficients of polynomial regression by hand. First, let's discuss the projection matrix approach. Let us introduce some necessary notation:
 Let X be the model matrix. This is a matrix with n+1 columns and N rows, where n is the desired order of polynomial regression and N is the number of data points. We fill it as follows:
  The first column we fill with ones.
  The second with the observed values x_{1}, ..., x_{N} of the independent variable.
  The third with the squares of these values.
  And so on...
  The last, (n+1)th, column with the nth powers of the observed values.

We end up with the following matrix:

⌈ 1  x_{1}  x_{1}^{2}  ...  x_{1}^{n} ⌉
| 1  x_{2}  x_{2}^{2}  ...  x_{2}^{n} |
| ...  ...  ...  ...  ... |
⌊ 1  x_{N}  x_{N}^{2}  ...  x_{N}^{n} ⌋

 Let y be a column vector filled with the values y_{1}, ..., y_{N} of the dependent variable:

⌈ y_{1} ⌉
| y_{2} |
| ... |
⌊ y_{N} ⌋

 Finally, β is the column vector of the coefficients of the polynomial regression model:

⌈ a_{0} ⌉
| a_{1} |
| ... |
⌊ a_{n} ⌋
Now, to determine the coefficients, we use the following matrix equation (the so-called normal equation):

β = (X^{T}X)^{-1}X^{T}y,

where:

 X^{T} is the transpose of X;
 (X^{T}X)^{-1} is the inverse of X^{T}X; and
 The operation between every two matrices is matrix multiplication.

⚠ For some very peculiar datasets, it may happen that the matrix X^{T}X is singular, i.e., its inverse does not exist. In such a case, the polynomial regression cannot be computed.
The normal equation is the method that our polynomial regression calculator uses. If you'd rather solve systems of linear equations than perform a bunch of matrix operations, you may benefit from the alternative method, which we provide in the following final section.
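The normal-equation recipe above translates almost line-for-line into NumPy. A sketch with hypothetical data (note that `np.vander` builds exactly the model matrix X described earlier):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.2, 1.9, 4.1, 9.2, 16.9])  # hypothetical data
n = 2                                     # desired degree

# Model matrix: columns are x^0, x^1, ..., x^n (a Vandermonde matrix)
X = np.vander(x, n + 1, increasing=True)

# Normal equation: beta = (X^T X)^(-1) X^T y
beta = np.linalg.inv(X.T @ X) @ X.T @ y   # [a_0, a_1, a_2]
print(beta)
```

If X^{T}X is singular (the peculiar case flagged above), `np.linalg.inv` raises `LinAlgError`; in production code one would prefer `np.linalg.lstsq`, which is also more numerically stable than forming the inverse explicitly.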
System of linear equations for a polynomial regression model
The coefficients of a polynomial regression model satisfy the following system of n+1 linear equations, one for each j = 0, 1, ..., n (it is the normal equation X^{T}Xβ = X^{T}y written out term by term):

a_{0}∑_{i}x_{i}^{j} + a_{1}∑_{i}x_{i}^{j+1} + ... + a_{n}∑_{i}x_{i}^{j+n} = ∑_{i}x_{i}^{j}y_{i}
You may use any method of solving systems of linear equations to deal with this system and work out the coefficients.
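For instance, a sketch (hypothetical data, assuming NumPy) that assembles the (n+1)×(n+1) system directly from power sums of the data and hands it to a linear solver:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.2, 1.9, 4.1, 9.2, 16.9])  # hypothetical data
n = 2

# A[j, k] = sum_i x_i^(j+k),  b[j] = sum_i x_i^j * y_i
A = np.array([[np.sum(x ** (j + k)) for k in range(n + 1)]
              for j in range(n + 1)])
b = np.array([np.sum(x ** j * y) for j in range(n + 1)])

coeffs = np.linalg.solve(A, b)            # [a_0, a_1, ..., a_n]
print(coeffs)
```

This yields the same coefficients as the matrix formula from the previous section, since both are just different ways of writing the normal equation.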
FAQ
Why is polynomial regression linear?
Polynomial regression is a particular case of the linear regression model because its equation:

y = a_{0} + a_{1}x + a_{2}x^{2} + ... + a_{n}x^{n}

is linear as a function of the regression coefficients a_{0}, a_{1}, ..., a_{n}. However, polynomial regression can model all sorts of nonlinear relationships between x and y!
How many points do I need to fit polynomial regression?
The number of data points needed to determine the polynomial regression model depends on the degree of the polynomial you want to fit. For degree n, you need at least n+1 data points. If you have exactly n+1 points, then the fit will be perfect, i.e., the curve will go through every point. Remember, the model is more reliable when you build it on a larger sample!
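The "exactly n+1 points gives a perfect fit" claim is easy to check numerically, as in this sketch (hypothetical data, assuming NumPy):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])   # exactly 4 points
y = np.array([5.0, -1.0, 2.0, 7.0])  # hypothetical values

coeffs = np.polyfit(x, y, 3)         # degree 3 = 4 - 1 points
pred = np.polyval(coeffs, x)

# The curve passes through every data point, so residuals vanish
print(np.max(np.abs(pred - y)))
```

This is just polynomial interpolation in disguise: a degree-n polynomial has n+1 free coefficients, exactly enough to hit n+1 points with distinct x-values.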
Can I always calculate polynomial regression?
No, it may happen that the polynomial regression cannot be fitted. However, this occurs only for very peculiar data sets, so you have a very low chance of ever facing this problem with actual real-life data.