This article shows how to implement linear regression in Python in three ways: a closed-form least-squares (normal equation) solution, batch gradient descent, and scikit-learn's LinearRegression, all fitted on the iris dataset to predict Sepal.Length from Sepal.Width, Petal.Length and Petal.Width. Readers interested in the topic may find it a useful reference.
Implementing linear regression in Python
Code:
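Before the code, it helps to state what the two hand-written routines compute. Both minimize the mean squared error of a linear model; written in the same notation as the comment in the code below (X is the m x n training matrix including the constant Bias column, y the target vector, alpha the learning rate), the closed-form least-squares solution and the gradient-descent update are:

theta = (X'X)^(-1) X'y
beta  <- beta - (2*alpha/m) * X'(X*beta - y)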
#encoding:utf-8
"""
Author: njulpy
Version: 1.0
Date: 2018/04/09
Project: Using Python to Implement the Linear Regression Algorithm
"""
import numpy as np
import pandas as pd
from numpy.linalg import inv
from numpy import dot
from sklearn.model_selection import train_test_split
import matplotlib.pyplot as plt
from sklearn import linear_model
# Ordinary least squares (normal equation)
def lms(x_train, y_train, x_test, y_test):
    theta_n = dot(dot(inv(dot(x_train.T, x_train)), x_train.T), y_train)  # theta = (X'X)^(-1)X'Y
    y_pre = dot(x_test, theta_n)             # predictions on the test set
    mse = np.average((y_test - y_pre) ** 2)  # mean squared error on the test set
    return theta_n, y_pre, mse
# Gradient descent
def train(x_train, y_train, num, alpha, m, n):
    beta = np.ones(n)                        # initialize all weights to 1
    for i in range(num):
        h = np.dot(x_train, beta)            # current predictions
        error = h - y_train.T                # residuals against the training targets
        delt = 2 * alpha * np.dot(error, x_train) / m  # gradient of the MSE loss
        beta = beta - delt
    return beta
if __name__ == "__main__":
    iris = pd.read_csv('iris.csv')
    iris['Bias'] = float(1)                  # constant column acting as the intercept term
    x = iris[['Sepal.Width', 'Petal.Length', 'Petal.Width', 'Bias']]
    y = iris['Sepal.Length']
    x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=5)
    t = np.arange(len(x_test))
    m, n = np.shape(x_train)

    # Least squares (normal equation)
    theta_n, y_pre, mse = lms(x_train, y_train, x_test, y_test)
    # plt.plot(t, y_test, label='Test')
    # plt.plot(t, y_pre, label='Predict')
    # plt.show()

    # Gradient descent
    beta = train(x_train, y_train, 1000, 0.001, m, n)
    y_predict = np.dot(x_test, beta.T)
    # plt.plot(t, y_predict)
    # plt.plot(t, y_test)
    # plt.show()

    # sklearn
    regr = linear_model.LinearRegression()
    regr.fit(x_train, y_train)
    y_p = regr.predict(x_test)

    print(regr.coef_, theta_n, beta)
    l1, = plt.plot(t, y_predict)
    l2, = plt.plot(t, y_p)
    l3, = plt.plot(t, y_pre)
    l4, = plt.plot(t, y_test)
    plt.legend(handles=[l1, l2, l3, l4],
               labels=['GradientDescent', 'sklearn', 'Leastsquare', 'True'], loc='best')
    plt.show()
Output (each vector lists the weights for Sepal.Width, Petal.Length, Petal.Width and the Bias column):
sklearn: [ 0.65368836 0.70955523 -0.54193454 0. ]
LeastSquare: [ 0.65368836 0.70955523 -0.54193454 1.84603897]
GradientDescent: [ 0.98359285 0.29325906 0.60084232 1.006859 ]
scikit-learn reports 0 for the Bias column because LinearRegression fits its own intercept (available as regr.intercept_); that intercept corresponds to the 1.84603897 bias weight found by the normal equation, and the first three weights of the two methods agree exactly.
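After 1000 iterations with a fixed learning rate of 0.001, the gradient-descent weights are still far from the closed-form solution: the features are on different scales and the step size is small, so convergence is slow. As a minimal follow-up sketch (my own addition, not part of the original script; the iteration count and learning rate are arbitrary choices), you can standardize the three feature columns, rerun the same train() routine, and map the result back to the original scale for comparison with theta_n:

# Hypothetical convergence check: standardize features, rerun gradient descent,
# then convert the weights back to the original feature scale.
x_tr = np.asarray(x_train, dtype=float)
mean = x_tr[:, :3].mean(axis=0)          # column means of the three features
std = x_tr[:, :3].std(axis=0)            # column standard deviations
x_scaled = x_tr.copy()
x_scaled[:, :3] = (x_tr[:, :3] - mean) / std

beta_s = train(x_scaled, np.asarray(y_train), 20000, 0.01, m, n)

beta_back = np.empty(n)
beta_back[:3] = beta_s[:3] / std                            # un-scale the feature weights
beta_back[3] = beta_s[3] - np.sum(beta_s[:3] * mean / std)  # adjust the bias term
print('GradientDescent (standardized):', beta_back)         # should land close to theta_n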