扣丁学堂 Python Training Shares a Logistic Regression Algorithm Implemented in Python

In this article, 扣丁学堂 Python Training introduces a logistic regression algorithm implemented in Python, using a concrete example to analyze the relevant implementation techniques. Readers who need it, or who are simply interested in Python development, can use it as a reference.


Using Python to implement the logistic regression algorithm
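
The train function in the code below fits the model by batch gradient descent; each iteration applies the standard logistic-regression update (shown here as a reference formula, where X is the m-by-n training matrix, y the label vector, α the learning rate, and σ the sigmoid function):

$$\beta \leftarrow \beta - \frac{\alpha}{m}\, X^{\top}\bigl(\sigma(X\beta) - y\bigr)$$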

Code:

# encoding: utf-8
"""
 Author: njulpy
 Version: 1.0
 Date: 2019/08/26
 Project: Using Python to Implement LogisticRegression Algorithm
"""
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split


# Define the sigmoid function
def sigmoid(x):
    x = x.astype(float)
    return 1. / (1 + np.exp(-x))


# Train the model with batch gradient descent
def train(x_train, y_train, num, alpha, m, n):
    beta = np.ones(n)                                 # initialize the parameter vector
    for i in range(num):
        h = sigmoid(np.dot(x_train, beta))            # predicted values
        error = h - y_train.T                         # difference between predictions and training labels
        delt = alpha * (np.dot(error, x_train)) / m   # gradient step for the parameters
        beta = beta - delt
        # print('error', error)
    return beta


# Map the sigmoid output to the three class codes 0 / 0.5 / 1
def predict(x_test, beta):
    y_predict = np.zeros(len(x_test)) + 0.5
    s = sigmoid(np.dot(beta, x_test.T))
    y_predict[s < 0.34] = 0
    y_predict[s > 0.67] = 1
    return y_predict


# "Accuracy" here is 1 minus the mean absolute difference between the codes
def accuracy(y_predict, y_test):
    acc = 1 - np.sum(np.absolute(y_predict - y_test)) / len(y_test)
    return acc


if __name__ == "__main__":
    data = pd.read_csv('iris.csv')
    x = data.iloc[:, 1:5]
    y = data.iloc[:, 5].copy()
    # Encode the three Iris species as 0, 0.5 and 1
    y.loc[y == 'setosa'] = 0
    y.loc[y == 'versicolor'] = 0.5
    y.loc[y == 'virginica'] = 1
    x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.3, random_state=15)
    m, n = np.shape(x_train)
    alpha = 0.01
    beta = train(x_train, y_train, 1000, alpha, m, n)
    pre = predict(x_test, beta)
    # Plot the predicted and true values over the test samples
    t = np.arange(len(x_test))
    plt.figure()
    p1 = plt.plot(t, pre)
    p2 = plt.plot(t, y_test, label='test')
    label = ['prediction', 'true']
    plt.legend(label, loc=1)
    plt.show()
    acc = accuracy(pre, y_test)
    print('The predicted value is ', pre)
    print('The true value is ', np.array(y_test))
    print('The accuracy rate is ', acc)
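
Note that this script encodes the three Iris species as 0, 0.5 and 1 and fits a single sigmoid, which treats the task more like an ordinal regression than a standard multi-class logistic regression. For comparison only, here is a minimal sketch (not part of the original script) of the same task using scikit-learn's built-in multinomial LogisticRegression and its bundled Iris data; the parameters shown are illustrative assumptions:

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load the built-in Iris data instead of iris.csv: 150 samples, 4 features, labels 0/1/2
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=15)

# Multinomial logistic regression handles the three classes directly
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print('scikit-learn test accuracy:', clf.score(X_test, y_test))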

Output:

The predicted value is [ 0.  0.5  1.  0.  0.  1.  1.  0.5  1.  1.  1.  0.5  0.5  0.5  1.
  0.  0.5  1.  0.  1.  0.5  0.  0.5  0.5  0.  0.  1.  1.  1.  1.
  0.  1.  1.  1.  0.  0.  1.  0.  0.  0.5  1.  0.  0.  0.5  1. ]
The true value is [0 0.5 0.5 0 0 0.5 1 0.5 0.5 1 1 0.5 0.5 0.5 1 0 0.5 1 0 1 0.5 0 0.5 0.5 0
 0 1 1 1 0.5 0 1 0.5 1 0 0 1 0 0 0.5 1 0 0 0.5 1]
The accuracy rate is 0.9444444444444444
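
The accuracy printed above is 1 minus the mean absolute difference between the 0/0.5/1 codes, so a prediction that is off by one "step" (for example 0.5 instead of 1) only costs 0.5 rather than a full error. If plain classification accuracy (the fraction of exactly matching labels) is wanted instead, a minimal sketch reusing pre and y_test from the script above would be:

import numpy as np

# Fraction of test samples whose predicted code exactly matches the true code
exact_acc = np.mean(pre == np.asarray(y_test, dtype=float))
print('Exact-match accuracy:', exact_acc)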

 


If you would like to learn more about Python and artificial intelligence, follow the 扣丁学堂 Python Training official website, WeChat account, and other platforms. As an online IT vocational education platform, 扣丁学堂 provides authoritative videos on setting up a Python development environment. The prospects after Python training are broad, with industry salaries and career development continuing to improve, and the Python video tutorials carefully produced by 扣丁学堂 instructors will help you quickly master practical Python development skills from beginner to advanced.