1. Linear regression with two input features

The test data is ex1data2.txt:

(47 rows of data, each row three comma-separated values: two input features and one target. The numeric values were lost in the original page's formatting.)
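For reference, the code below implements batch gradient descent on the mean-squared-error cost. With the features normalized as $x' = (x - \mu)/\sigma$ (featureNormalize), the hypothesis, the cost computed by computeCost, and the update applied in gradientDescent are:

$$
h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2,\qquad
J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2,\qquad
\theta := \theta - \frac{\alpha}{m}\,X^{T}(X\theta - y)
$$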

The Python code is as follows:

    # -*- coding: UTF-8 -*-

    import random
    import numpy as np
    import matplotlib.pyplot as plt

    # Load the data
    def load_exdata(filename):
        data = []
        with open(filename, 'r') as f:
            for line in f.readlines():
                line = line.split(',')
                # Use int or another type here depending on the input data
                current = [int(item) for item in line]
                data.append(current)
        return data

    data = load_exdata('ex1data2.txt')
    data = np.array(data, np.int64)  # use int or another dtype depending on the input data

    # Feature scaling
    def featureNormalize(X):
        mu = np.zeros((1, X.shape[1]))
        sigma = np.zeros((1, X.shape[1]))
        for i in range(X.shape[1]):
            mu[0, i] = np.mean(X[:, i])    # mean
            sigma[0, i] = np.std(X[:, i])  # standard deviation
        X_norm = (X - mu) / sigma
        return X_norm, mu, sigma

    # Compute the cost
    def computeCost(X, y, theta):
        m = y.shape[0]
        # J = (np.sum((X.dot(theta) - y)**2)) / (2*m)
        C = X.dot(theta) - y
        J2 = (C.T.dot(C)) / (2 * m)
        return J2

    # Gradient descent
    def gradientDescent(X, y, theta, alpha, num_iters):
        m = y.shape[0]
        # Store the cost of every iteration
        J_history = np.zeros((num_iters, 1))
        for iter in range(num_iters):
            # Differentiating J gives theta := theta - (alpha/m) * X^T (X.theta - y);
            # X.T is (3, m) and (X.theta - y) is (m, 1), so the update is (3, 1)
            theta = theta - (alpha / m) * (X.T.dot(X.dot(theta) - y))
            J_history[iter] = computeCost(X, y, theta)
        return J_history, theta

    iterations = 400  # number of iterations (assumed; the original value was lost)
    alpha = 0.01      # learning rate
    x = data[:, (0, 1)].reshape((-1, 2))  # the two input feature columns
    y = data[:, 2].reshape((-1, 1))       # the target is the third column
    m = y.shape[0]
    x, mu, sigma = featureNormalize(x)
    X = np.hstack([x, np.ones((x.shape[0], 1))])  # append the intercept column

    theta = np.zeros((3, 1))  # two features plus the intercept

    j = computeCost(X, y, theta)
    J_history, theta = gradientDescent(X, y, theta, alpha, iterations)

    print('Theta found by gradient descent', theta)

    def predict(data):
        testx = np.array(data)
        testx = (testx - mu) / sigma  # apply the same normalization as in training
        testx = np.hstack([testx, np.ones((testx.shape[0], 1))])
        price = testx.dot(theta)
        print('price is %d ' % price[0, 0])

    predict([1650, 3])  # example input (assumed; the original feature values were lost)
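As a sanity check on the gradient-descent result, the same theta can be computed in closed form from the normal equation $\theta = (X^{T}X)^{-1}X^{T}y$. A minimal sketch, reusing the X and y built above (the already-normalized design matrix, so the two solutions are directly comparable):

    # Closed-form least-squares solution, for comparison with gradient descent
    theta_exact = np.linalg.pinv(X.T.dot(X)).dot(X.T).dot(y)
    print('Theta from the normal equation', theta_exact)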

2. Multivariate linear regression, with three input features as an example

Input data: testdata.txt. The first column is a running row index, not an input, and must not be read in (see the loader sketch after the listing). Blank cells in the listing below are values lost in the original page's formatting.

  1. ,230.1,37.8,69.2,22.1
  2. ,44.5,39.3,45.1,10.4
  3. ,17.2,45.9,69.3,9.3
  4. ,151.5,41.3,58.5,18.5
  5. ,180.8,10.8,58.4,12.9
  6. ,8.7,48.9,,7.2
  7. ,57.5,32.8,23.5,11.8
  8. ,120.2,19.6,11.6,13.2
  9. ,8.6,2.1,,4.8
  10. ,199.8,2.6,21.2,10.6
  11. ,66.1,5.8,24.2,8.6
  12. ,214.7,,,17.4
  13. ,23.8,35.1,65.9,9.2
  14. ,97.5,7.6,7.2,9.7
  15. ,204.1,32.9,,
  16. ,195.4,47.7,52.9,22.4
  17. ,67.8,36.6,,12.5
  18. ,281.4,39.6,55.8,24.4
  19. ,69.2,20.5,18.3,11.3
  20. ,147.3,23.9,19.1,14.6
  21. ,218.4,27.7,53.4,
  22. ,237.4,5.1,23.5,12.5
  23. ,13.2,15.9,49.6,5.6
  24. ,228.3,16.9,26.2,15.5
  25. ,62.3,12.6,18.3,9.7
  26. ,262.9,3.5,19.5,
  27. ,142.9,29.3,12.6,
  28. ,240.1,16.7,22.9,15.9
  29. ,248.8,27.1,22.9,18.9
  30. ,70.6,,40.8,10.5
  31. ,292.9,28.3,43.2,21.4
  32. ,112.9,17.4,38.6,11.9
  33. ,97.2,1.5,,9.6
  34. ,265.6,,0.3,17.4
  35. ,95.7,1.4,7.4,9.5
  36. ,290.7,4.1,8.5,12.8
  37. ,266.9,43.8,,25.4
  38. ,74.7,49.4,45.7,14.7
  39. ,43.1,26.7,35.1,10.1
  40. ,,37.7,,21.5
  41. ,202.5,22.3,31.6,16.6
  42. ,,33.4,38.7,17.1
  43. ,293.6,27.7,1.8,20.7
  44. ,206.9,8.4,26.4,12.9
  45. ,25.1,25.7,43.3,8.5
  46. ,175.1,22.5,31.5,14.9
  47. ,89.7,9.9,35.7,10.6
  48. ,239.9,41.5,18.5,23.2
  49. ,227.2,15.8,49.9,14.8
  50. ,66.9,11.7,36.8,9.7
  51. ,199.8,3.1,34.6,11.4
  52. ,100.4,9.6,3.6,10.7
  53. ,216.4,41.7,39.6,22.6
  54. ,182.6,46.2,58.7,21.2
  55. ,262.7,28.8,15.9,20.2
  56. ,198.9,49.4,,23.7
  57. ,7.3,28.1,41.4,5.5
  58. ,136.2,19.2,16.6,13.2
  59. ,210.8,49.6,37.7,23.8
  60. ,210.7,29.5,9.3,18.4
  61. ,53.5,,21.4,8.1
  62. ,261.3,42.7,54.7,24.2
  63. ,239.3,15.5,27.3,15.7
  64. ,102.7,29.6,8.4,
  65. ,131.1,42.8,28.9,
  66. ,,9.3,0.9,9.3
  67. ,31.5,24.6,2.2,9.5
  68. ,139.3,14.5,10.2,13.4
  69. ,237.4,27.5,,18.9
  70. ,216.8,43.9,27.2,22.3
  71. ,199.1,30.6,38.7,18.3
  72. ,109.8,14.3,31.7,12.4
  73. ,26.8,,19.3,8.8
  74. ,129.4,5.7,31.3,
  75. ,213.4,24.6,13.1,
  76. ,16.9,43.7,89.4,8.7
  77. ,27.5,1.6,20.7,6.9
  78. ,120.5,28.5,14.2,14.2
  79. ,5.4,29.9,9.4,5.3
  80. ,,7.7,23.1,
  81. ,76.4,26.7,22.3,11.8
  82. ,239.8,4.1,36.9,12.3
  83. ,75.3,20.3,32.5,11.3
  84. ,68.4,44.5,35.6,13.6
  85. ,213.5,,33.8,21.7
  86. ,193.2,18.4,65.7,15.2
  87. ,76.3,27.5,,
  88. ,110.7,40.6,63.2,
  89. ,88.3,25.5,73.4,12.9
  90. ,109.8,47.8,51.4,16.7
  91. ,134.3,4.9,9.3,11.2
  92. ,28.6,1.5,,7.3
  93. ,217.7,33.5,,19.4
  94. ,250.9,36.5,72.3,22.2
  95. ,107.4,,10.9,11.5
  96. ,163.3,31.6,52.9,16.9
  97. ,197.6,3.5,5.9,11.7
  98. ,184.9,,,15.5
  99. ,289.7,42.3,51.2,25.4
  100. ,135.2,41.7,45.9,17.2
  101. ,222.4,4.3,49.8,11.7
  102. ,296.4,36.3,100.9,23.8
  103. ,280.2,10.1,21.4,14.8
  104. ,187.9,17.2,17.9,14.7
  105. ,238.2,34.3,5.3,20.7
  106. ,137.9,46.4,,19.2
  107. ,,,29.7,7.2
  108. ,90.4,0.3,23.2,8.7
  109. ,13.1,0.4,25.6,5.3
  110. ,255.4,26.9,5.5,19.8
  111. ,225.8,8.2,56.5,13.4
  112. ,241.7,,23.2,21.8
  113. ,175.7,15.4,2.4,14.1
  114. ,209.6,20.6,10.7,15.9
  115. ,78.2,46.8,34.5,14.6
  116. ,75.1,,52.7,12.6
  117. ,139.2,14.3,25.6,12.2
  118. ,76.4,0.8,14.8,9.4
  119. ,125.7,36.9,79.2,15.9
  120. ,19.4,,22.3,6.6
  121. ,141.3,26.8,46.2,15.5
  122. ,18.8,21.7,50.4,
  123. ,,2.4,15.6,11.6
  124. ,123.1,34.6,12.4,15.2
  125. ,229.5,32.3,74.2,19.7
  126. ,87.2,11.8,25.9,10.6
  127. ,7.8,38.9,50.6,6.6
  128. ,80.2,,9.2,8.8
  129. ,220.3,,3.2,24.7
  130. ,59.6,,43.1,9.7
  131. ,0.7,39.6,8.7,1.6
  132. ,265.2,2.9,,12.7
  133. ,8.4,27.2,2.1,5.7
  134. ,219.8,33.5,45.1,19.6
  135. ,36.9,38.6,65.6,10.8
  136. ,48.3,,8.5,11.6
  137. ,25.6,,9.3,9.5
  138. ,273.7,28.9,59.7,20.8
  139. ,,25.9,20.5,9.6
  140. ,184.9,43.9,1.7,20.7
  141. ,73.4,,12.9,10.9
  142. ,193.7,35.4,75.6,19.2
  143. ,220.5,33.2,37.9,20.1
  144. ,104.6,5.7,34.4,10.4
  145. ,96.2,14.8,38.9,11.4
  146. ,140.3,1.9,,10.3
  147. ,240.1,7.3,8.7,13.2
  148. ,243.2,,44.3,25.4
  149. ,,40.3,11.9,10.9
  150. ,44.7,25.8,20.6,10.1
  151. ,280.7,13.9,,16.1
  152. ,,8.4,48.7,11.6
  153. ,197.6,23.3,14.2,16.6
  154. ,171.3,39.7,37.7,
  155. ,187.8,21.1,9.5,15.6
  156. ,4.1,11.6,5.7,3.2
  157. ,93.9,43.5,50.5,15.3
  158. ,149.8,1.3,24.3,10.1
  159. ,11.7,36.9,45.2,7.3
  160. ,131.7,18.4,34.6,12.9
  161. ,172.5,18.1,30.7,14.4
  162. ,85.7,35.8,49.3,13.3
  163. ,188.4,18.1,25.6,14.9
  164. ,163.5,36.8,7.4,
  165. ,117.2,14.7,5.4,11.9
  166. ,234.5,3.4,84.8,11.9
  167. ,17.9,37.6,21.6,
  168. ,206.8,5.2,19.4,12.2
  169. ,215.4,23.6,57.6,17.1
  170. ,284.3,10.6,6.4,
  171. ,,11.6,18.4,8.4
  172. ,164.5,20.9,47.4,14.5
  173. ,19.6,20.1,,7.6
  174. ,168.4,7.1,12.8,11.7
  175. ,222.4,3.4,13.1,11.5
  176. ,276.9,48.9,41.8,
  177. ,248.4,30.2,20.3,20.2
  178. ,170.2,7.8,35.2,11.7
  179. ,276.7,2.3,23.7,11.8
  180. ,165.6,,17.6,12.6
  181. ,156.6,2.6,8.3,10.5
  182. ,218.5,5.4,27.4,12.2
  183. ,56.2,5.7,29.7,8.7
  184. ,287.6,,71.8,26.2
  185. ,253.8,21.3,,17.6
  186. ,,45.1,19.6,22.6
  187. ,139.5,2.1,26.6,10.3
  188. ,191.1,28.7,18.2,17.3
  189. ,,13.9,3.7,15.9
  190. ,18.7,12.1,23.4,6.7
  191. ,39.5,41.1,5.8,10.8
  192. ,75.5,10.8,,9.9
  193. ,17.2,4.1,31.6,5.9
  194. ,166.8,,3.6,19.6
  195. ,149.7,35.6,,17.3
  196. ,38.2,3.7,13.8,7.6
  197. ,94.2,4.9,8.1,9.7
  198. ,,9.3,6.4,12.8
  199. ,283.6,,66.2,25.5
  200. ,232.1,8.6,8.7,13.4
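Because the first column is only a row index, it has to be dropped before the remaining values are parsed. A minimal variant of load_exdata (a hypothetical helper, assuming the file on disk still carries the index column) that discards it at read time:

    def load_exdata_no_index(filename):
        data = []
        with open(filename, 'r') as f:
            for line in f.readlines():
                items = line.split(',')[1:]  # drop the leading row index
                data.append([float(item) for item in items])
        return data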

The Python code:

    # -*- coding: UTF-8 -*-

    import random
    import numpy as np
    import matplotlib.pyplot as plt

    # Load the data
    def load_exdata(filename):
        data = []
        with open(filename, 'r') as f:
            for line in f.readlines():
                line = line.split(',')
                current = [float(item) for item in line]
                data.append(current)
        return data

    data = load_exdata('testdata.txt')
    data = np.array(data, np.float64)  # the data is floating point

    # Feature scaling
    def featureNormalize(X):
        mu = np.zeros((1, X.shape[1]))
        sigma = np.zeros((1, X.shape[1]))
        for i in range(X.shape[1]):
            mu[0, i] = np.mean(X[:, i])    # mean
            sigma[0, i] = np.std(X[:, i])  # standard deviation
        X_norm = (X - mu) / sigma
        return X_norm, mu, sigma

    # Compute the cost
    def computeCost(X, y, theta):
        m = y.shape[0]
        # J = (np.sum((X.dot(theta) - y)**2)) / (2*m)
        C = X.dot(theta) - y
        J2 = (C.T.dot(C)) / (2 * m)
        return J2

    # Gradient descent
    def gradientDescent(X, y, theta, alpha, num_iters):
        m = y.shape[0]
        # Store the cost of every iteration
        J_history = np.zeros((num_iters, 1))
        for iter in range(num_iters):
            # Differentiating J gives theta := theta - (alpha/m) * X^T (X.theta - y);
            # X.T is (4, m) and (X.theta - y) is (m, 1), so the update is (4, 1)
            theta = theta - (alpha / m) * (X.T.dot(X.dot(theta) - y))
            J_history[iter] = computeCost(X, y, theta)
        return J_history, theta

    iterations = 400  # number of iterations (assumed; the original value was lost)
    alpha = 0.01      # learning rate
    # Columns 0, 1, 2 of each row are the three input features, one sample per row
    x = data[:, (0, 1, 2)].reshape((-1, 3))
    # The fourth column is the target
    y = data[:, 3].reshape((-1, 1))
    m = y.shape[0]
    x, mu, sigma = featureNormalize(x)
    X = np.hstack([x, np.ones((x.shape[0], 1))])

    theta = np.zeros((4, 1))  # three features plus the intercept, so theta has four components

    j = computeCost(X, y, theta)
    J_history, theta = gradientDescent(X, y, theta, alpha, iterations)

    print('Theta found by gradient descent', theta)

    def predict(data):
        testx = np.array(data)
        testx = (testx - mu) / sigma
        testx = np.hstack([testx, np.ones((testx.shape[0], 1))])
        price = testx.dot(theta)
        print('predict value is %f ' % price[0, 0])

    predict([151.5, 41.3, 58.5])  # a three-feature input
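The script imports matplotlib but never uses it; a minimal sketch that plots the recorded cost history to confirm that gradient descent has converged:

    # Plot the cost recorded at each iteration by gradientDescent
    plt.plot(range(len(J_history)), J_history.ravel())
    plt.xlabel('Iteration')
    plt.ylabel('Cost J')
    plt.title('Convergence of gradient descent')
    plt.show()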

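The fitted model can also be cross-checked against scikit-learn (assumed to be installed), whose LinearRegression solves the same least-squares problem without manual feature scaling or a learning rate:

    from sklearn.linear_model import LinearRegression

    # Fit on the raw, unscaled features: columns 0-2 are inputs, column 3 is the target
    reg = LinearRegression()
    reg.fit(data[:, 0:3], data[:, 3])
    print('coefficients:', reg.coef_, 'intercept:', reg.intercept_)
    print('prediction:', reg.predict([[151.5, 41.3, 58.5]]))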