AI-IBM-Cognitive Class -- Linear Regression
Linear Regression
import matplotlib.pyplot as plt
import pandas as pd
import pylab as pl
import numpy as np
%matplotlib inline
What %matplotlib inline does
- It is mainly used when working in Jupyter Notebook or Jupyter QtConsole.
- With %matplotlib inline, whenever you call a matplotlib.pyplot plotting function such as plot(), or create a figure, the image is rendered directly in the notebook/console output.
When running the code in Spyder or PyCharm, this line can be commented out.
Download the dataset
!wget -O FuelConsumption.csv https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/ML0101ENv3/labs/FuelConsumptionCo2.csv
df = pd.read_csv("FuelConsumption.csv")  # use pandas to read the csv file; note wget saved it as FuelConsumption.csv
# take a look at the dataset: show the top 10 rows
df.head(10)
# summarize the data
print(df.describe())
Use the describe function to get a quick statistical summary of the table: count, mean, min, max, and the percentile values for each column.
Reorganize the table, extracting only the columns we care about:
cdf = df[['ENGINESIZE','CYLINDERS','FUELCONSUMPTION_COMB','CO2EMISSIONS','FUELCONSUMPTION_CITY']]
cdf.head(9)
Each column can be plotted as a histogram with hist():
viz = cdf[['CYLINDERS','ENGINESIZE','CO2EMISSIONS','FUELCONSUMPTION_COMB','FUELCONSUMPTION_CITY']]
viz.hist()
plt.show()
Use scatter() to draw a scatter plot, setting its parameters and color.
For details, see: https://blog.csdn.net/qiu931110/article/details/68130199
plt.scatter(cdf.FUELCONSUMPTION_COMB, cdf.CO2EMISSIONS, color='blue')
plt.xlabel("FUELCONSUMPTION_COMB")
plt.ylabel("Emission")
plt.show()
Split the table into a training set and a test set with a random mask (random values less than 0.8, i.e. roughly an 80/20 split), then draw a scatter plot of the training data.
Creating train and test dataset
Train/Test Split means splitting the dataset into training and testing sets that are mutually exclusive. You then train with the training set and test with the testing set. This gives a more accurate evaluation of out-of-sample accuracy, because the testing set is not part of the data used to train the model, which is more realistic for real-world problems.
This means that we know the outcome of each data point in the testing set, making it great to test with! And since this data has not been used to train the model, the model has no knowledge of the outcome of these data points. So, in essence, it is truly out-of-sample testing.
msk = np.random.rand(len(df)) < 0.8
train = cdf[msk]
test = cdf[~msk]
print(train)
print(test)
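Note that the random mask above produces a slightly different split each time the cell is run. As a minimal alternative sketch (not part of the original lab), scikit-learn's train_test_split gives the same 80/20 split with a fixed random_state so the result is reproducible:
from sklearn.model_selection import train_test_split
# 80/20 split of the selected columns; random_state fixes the shuffle so runs are repeatable
train, test = train_test_split(cdf, test_size=0.2, random_state=42)
print(train.shape, test.shape)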
plt.scatter(train.ENGINESIZE, train.CO2EMISSIONS, color='blue')
plt.xlabel("Engine size")
plt.ylabel("Emission")
plt.show()
Modeling: using the sklearn package to model the data.
from sklearn import linear_model
regr = linear_model.LinearRegression()
train_x = np.asanyarray(train[['ENGINESIZE']])
train_y = np.asanyarray(train[['CO2EMISSIONS']])
regr.fit(train_x, train_y)
# The coefficients of the fitted line
print('Coefficients: ', regr.coef_)
print('Intercept: ', regr.intercept_)
out:
Coefficients: [[39.64984954]]
Intercept: [124.08949291]
As mentioned before, the coefficient and the intercept of a simple linear regression are the parameters of the fitted line. Given that this is a simple linear regression with only two parameters, and knowing that those parameters are the intercept and the slope of the line, sklearn can estimate them directly from our data.
Notice that all of the data must be available to traverse and calculate the parameters.
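To make the two numbers concrete, here is a small sketch (the 3.0 L engine size is just an illustrative value, not from the lab) showing that a manual prediction from the slope and intercept matches regr.predict:
engine_size = 3.0  # illustrative engine size in litres
# manual prediction from the fitted line: y = intercept + slope * x
manual_pred = regr.intercept_[0] + regr.coef_[0][0] * engine_size
# the same prediction through sklearn's API, which expects a 2D array
sklearn_pred = regr.predict([[engine_size]])[0][0]
print(manual_pred, sklearn_pred)  # both print the same predicted CO2 emission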
plt.scatter(train.ENGINESIZE, train.CO2EMISSIONS, color='blue')
plt.plot(train_x, regr.coef_[0][0]*train_x + regr.intercept_[0], '-r')
# draw the fitted regression line from the slope and intercept
plt.xlabel("Engine size")
plt.ylabel("Emission")
For linear regression with sklearn.linear_model.LinearRegression, see the following reference:
https://www.cnblogs.com/magle/p/5881170.html
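The test set created earlier is not used in this post; as a hedged sketch of the usual next step (mirroring the out-of-sample idea described above, with r2_score taken from sklearn.metrics), the fitted model can be evaluated on the held-out data like this:
from sklearn.metrics import r2_score
# evaluate the fitted model on the held-out test set
test_x = np.asanyarray(test[['ENGINESIZE']])
test_y = np.asanyarray(test[['CO2EMISSIONS']])
test_y_hat = regr.predict(test_x)
print("Mean absolute error: %.2f" % np.mean(np.absolute(test_y_hat - test_y)))
print("Mean squared error: %.2f" % np.mean((test_y_hat - test_y) ** 2))
print("R2-score: %.2f" % r2_score(test_y, test_y_hat))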