codingstreets

Introduction to Machine Learning Polynomial Regression

In this article, you will learn about polynomial regression in machine learning.

Before moving ahead, take a look at the Introduction to Machine Learning Linear Regression.

Polynomial Regression

If your data points do not fit a linear regression (a single straight line through all of the data), polynomial regression may be a better choice.

Like linear regression, polynomial regression uses the relationship between the variables x and y to find the best way to draw a curve through the data points.

How does it work?

Python has methods for finding the relationship between data points and drawing a polynomial regression line.

Example: Use the scatter() method to draw a scatter diagram.

import matplotlib.pyplot as plt

# data points to visualize
x_axis = [10, 12, 18, 25, 33]
y_axis = [18, 29, 34, 39, 42]

plt.scatter(x_axis, y_axis)

plt.show()
[Figure: scatter plot of the data points]

As shown above, the code returns a scatter plot of the given data.

Example: Import NumPy and Matplotlib to draw the polynomial regression line.

import numpy as np
import matplotlib.pyplot as plt

x_axis = [10, 12, 18, 25, 33]
y_axis = [18, 29, 34, 39, 42]

data = np.poly1d(np.polyfit(x_axis, y_axis, 3))

info = np.linspace(1, 30, 200)

plt.scatter(x_axis, y_axis)
plt.plot(info, data(info))

plt.show()

In lines 1 and 2,

The NumPy and Matplotlib modules are imported to build the model and draw the diagram.

In lines 4 and 5,

The variables x_axis and y_axis are defined as the data points shown on the graph.

In line 7,

NumPy's polyfit method fits the polynomial, and poly1d turns the resulting coefficients into a callable model:

In line 9,

Then we specify where the line will be drawn: 200 evenly spaced x-values, starting at position 1 and ending at position 30.

In line 11,

Draw the original scatter plot:

In line 12,

Draw the line of polynomial regression.
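As a quick illustration of what the polyfit/poly1d step is doing, here is a minimal sketch with toy data (the values below are mine, not the article's): polyfit finds the coefficients, and poly1d turns them into a function you can evaluate.

```python
import numpy as np

# polyfit returns the coefficients (highest power first);
# poly1d wraps them into a callable polynomial
x = [0, 1, 2, 3]
y = [1, 3, 7, 13]          # exactly y = x**2 + x + 1
coeffs = np.polyfit(x, y, 2)
model = np.poly1d(coeffs)

print(np.round(coeffs, 4))  # recovers the quadratic's coefficients
print(model(4))             # evaluates the fitted polynomial at x = 4
```

Because the toy data lies exactly on a quadratic, the fit recovers the original coefficients.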

Note: Learn More about Numpy module at Numpy Tutorial.

Note: Learn more about the Matplotlib module at the Matplotlib Tutorial.

R-Squared

It is essential to understand how strong the relationship between the x- and y-values is: if there is no relationship, the polynomial cannot be used to predict anything.

This relationship is measured by a value known as r-squared.

The r-squared value ranges from 0 to 1, where 0 means no relationship and 1 means the x- and y-values are perfectly related.

Python, together with the sklearn module, can compute this value for you; all you need to do is feed it the two arrays (x and y).
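Under the hood, r-squared is simply 1 minus the ratio of residual variance to total variance. The sketch below (the degree-3 fit is my choice, not from the article) checks that sklearn's r2_score agrees with that definition:

```python
import numpy as np
from sklearn.metrics import r2_score

x_axis = [10, 12, 18, 25, 33]
y_axis = [18, 29, 34, 39, 42]

model = np.poly1d(np.polyfit(x_axis, y_axis, 3))
pred = model(x_axis)

# r-squared from its definition:
# 1 - (residual sum of squares / total sum of squares around the mean)
y = np.array(y_axis)
ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2_manual = 1 - ss_res / ss_tot

print(r2_manual, r2_score(y_axis, pred))
```

Both numbers match, so r2_score is doing exactly this computation.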

Example: Check whether the data fits a polynomial regression or not.

import numpy as np
from sklearn.metrics import r2_score

x_axis = [10, 12, 18, 25, 33]
y_axis = [18, 29, 34, 39, 42]

# degree 8 is deliberately higher than five points can support
data = np.poly1d(np.polyfit(x_axis, y_axis, 8))

print(r2_score(y_axis, data(x_axis)))
Output -

sys:1: RankWarning: Polyfit may be poorly conditioned
1.0

As a result, it returned a RankWarning: degree 8 is too high for only five data points. The r-squared of 1.0 means the curve passes through every point exactly, which here signals overfitting rather than a genuinely good model.
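To see how the chosen degree drives this, here is a small sketch (the degree choices are mine, not from the article) comparing r-squared across degrees on the same data:

```python
import numpy as np
from sklearn.metrics import r2_score

x_axis = [10, 12, 18, 25, 33]
y_axis = [18, 29, 34, 39, 42]

# with 5 points, a degree-4 polynomial already passes through every
# point exactly, so its r-squared of 1.0 reflects overfitting
scores = {}
for degree in [1, 2, 3, 4]:
    model = np.poly1d(np.polyfit(x_axis, y_axis, degree))
    scores[degree] = r2_score(y_axis, model(x_axis))
    print(degree, round(scores[degree], 4))
```

The score climbs with the degree and hits 1.0 exactly when the polynomial has enough freedom to interpolate all five points.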

Predict Future Values

Now we can use the fitted model to predict future values.

Example: Predict the sale of product ‘x’.

import numpy as np

x_axis = [10, 12, 18, 25, 33]
y_axis = [18, 29, 34, 39, 42]

data = np.poly1d(np.polyfit(x_axis, y_axis, 8))

# predict the sale when x = 5 (extrapolating below the data range)
sale = data(5)

print(sale)
Output -

sys:1: RankWarning: Polyfit may be poorly conditioned
-38.04542208235844

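The negative "sale" above is an artifact of extrapolating an overfitted degree-8 polynomial outside the data range. A sketch with a lower degree (degree 2 is an assumption on my part, not the article's choice) gives a more plausible value:

```python
import numpy as np

x_axis = [10, 12, 18, 25, 33]
y_axis = [18, 29, 34, 39, 42]

# a lower-degree model extrapolates far more sensibly than the
# overfitted degree-8 one used above
model = np.poly1d(np.polyfit(x_axis, y_axis, 2))
sale = model(5)

print(sale)
```

The quadratic fit predicts a small positive sale at x = 5, consistent with the downward trend of the data toward small x.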
Bad Result

Let's create an example where polynomial regression cannot predict future values well.

Example: Create a plot that shows a bad result for polynomial regression.

import numpy as np
import matplotlib.pyplot as plt

x_axis = [10, 12, 18, 25, 33]
y_axis = [18, 29, 34, 39, 42]

# degree 9 is far too high for five points, so the curve oscillates wildly
info = np.poly1d(np.polyfit(x_axis, y_axis, 9))

data = np.linspace(2, 30, 10)

plt.scatter(x_axis, y_axis)
plt.plot(data, info(data))
plt.show()
[Figure: polynomial regression line that fits the data poorly]

As a result, it returned a polynomial plot that clearly shows the line does not fit the data well.

Example: Check whether r-squared returns a low value or not.

import numpy as np
from sklearn.metrics import r2_score

x_axis = [10, 12, 18, 25, 33]
y_axis = [18, 29, 34, 39, 42]

# a degree-3 fit no longer passes through every point
info = np.poly1d(np.polyfit(x_axis, y_axis, 3))

print(r2_score(y_axis, info(x_axis)))
Output -

0.9539428530144852

As a result, it returned an r-squared of about 0.95 — below the perfect (overfitted) 1.0 seen earlier, showing that the degree-3 curve does not capture every point exactly.
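For contrast, here is a sketch with made-up, essentially unrelated data (the values are hypothetical, not from the article) that shows what a genuinely low r-squared looks like:

```python
import numpy as np
from sklearn.metrics import r2_score

# hypothetical data with no smooth trend — the y-values jump around
x_axis = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
y_axis = [5, 90, 3, 70, 10, 95, 8, 88, 2, 96]

model = np.poly1d(np.polyfit(x_axis, y_axis, 3))
score = r2_score(y_axis, model(x_axis))

print(score)  # a low value: the polynomial explains almost nothing
```

When the score is this low, the polynomial model cannot be used to predict anything.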

If you find anything incorrect in the topic discussed above, or have any further questions, please comment below.
