Logistic Regression for Predicting Whether a Horse Is Sick
1. The main idea of classifying with logistic regression
From the available data, build a regression formula for the decision boundary, that is, find the best-fit set of parameters, and then use it to classify new samples.
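As a small illustration of this idea (the weights below are made-up values, not fitted from any data): the classifier computes the weighted sum z = w0 + w1*x1 + w2*x2 and pushes it through the sigmoid. Since sigmoid(z) > 0.5 exactly when z > 0, the line z = 0 is the decision boundary.

```python
import numpy as np

def sigmoid(z):
    # squashes any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# hypothetical fitted weights [w0, w1, w2] for the augmented input [1, x1, x2]
weights = np.array([-3.0, 1.0, 1.0])

def classify(x1, x2):
    # the decision boundary is the line w0 + w1*x1 + w2*x2 = 0
    z = np.dot(weights, [1.0, x1, x2])
    return 1 if sigmoid(z) > 0.5 else 0

print(classify(4.0, 4.0))  # z = 5 > 0, so class 1
print(classify(1.0, 1.0))  # z = -1 < 0, so class 0
```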
2. Finding the best-fit parameters with gradient descent
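A minimal sketch of what gradient descent does here, using a made-up toy dataset (this is the full-batch version, for contrast with the stochastic version the implementation uses; the data and learning rate are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# toy data (hypothetical): 4 samples, 2 features, linearly separable labels
X = np.array([[1.0, 2.0], [2.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1.0, 1.0, 0.0, 0.0])

weights = np.zeros(2)
alpha = 0.1                      # learning rate
for _ in range(500):             # full-batch gradient descent
    h = sigmoid(X.dot(weights))  # predictions for all samples at once
    gradient = X.T.dot(h - y)    # gradient of the log-loss w.r.t. the weights
    weights -= alpha * gradient  # step against the gradient

print(weights)
```

After training, sigmoid(X·w) is above 0.5 exactly for the samples labeled 1.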
3. Implementation
# -*- coding: utf-8 -*-
"""
Created on Tue Mar 28 21:35:25 2017
@author: MyHome
"""
import numpy as np
from random import uniform

def sigmoid(inX):
    '''Sigmoid function'''
    return 1.0 / (1.0 + np.exp(-inX))

def StocGradientDescent(dataMatrix, classLabels, numIter=600):
    '''Update the weights with stochastic gradient descent and return the final values'''
    m, n = dataMatrix.shape
    weights = np.ones(n)
    for j in range(numIter):
        dataIndex = list(range(m))
        for i in range(m):
            # step size shrinks as training progresses but never reaches 0
            alpha = 4 / (1.0 + j + i) + 0.01
            # pick a random sample from those not yet used in this pass;
            # map through dataIndex so every sample is visited once per pass
            randIndex = int(uniform(0, len(dataIndex)))
            sample = dataIndex[randIndex]
            h = sigmoid(sum(dataMatrix[sample] * weights))
            gradient = (h - classLabels[sample]) * dataMatrix[sample]
            weights = weights - alpha * gradient
            del dataIndex[randIndex]
    return weights

def classifyVector(inX, weights):
    '''Classifier: probability above 0.5 means class 1'''
    prob = sigmoid(sum(inX * weights))
    if prob > 0.5:
        return 1.0
    else:
        return 0.0

def Test():
    '''Train on the training file, then measure the error rate on the test file'''
    frTrain = open("horseColicTraining.txt")
    frTest = open("horseColicTest.txt")
    trainingSet = []
    trainingLabel = []
    for line in frTrain.readlines():
        currLine = line.strip().split("\t")
        lineArr = []
        for i in range(21):
            lineArr.append(float(currLine[i]))
        trainingSet.append(lineArr)
        trainingLabel.append(float(currLine[21]))
    trainWeights = StocGradientDescent(np.array(trainingSet), trainingLabel)
    errorCount = 0.0
    numTestVec = 0.0
    for line in frTest.readlines():
        numTestVec += 1.0
        currLine = line.strip().split("\t")
        lineArr = []
        for i in range(21):
            lineArr.append(float(currLine[i]))
        if int(classifyVector(np.array(lineArr), trainWeights)) != int(float(currLine[21])):
            errorCount += 1
    errorRate = float(errorCount) / numTestVec
    print("the error rate of this test is:%f" % errorRate)
    return errorRate

def multiTest():
    '''Call Test() 10 times and average the error rate'''
    numTest = 10
    errorSum = 0.0
    for k in range(numTest):
        errorSum += Test()
    print("after %d iterations the average error rate is: %f" % (numTest, errorSum / float(numTest)))

if __name__ == "__main__":
    multiTest()
Results:
the error rate of this test is:0.522388
the error rate of this test is:0.328358
the error rate of this test is:0.313433
the error rate of this test is:0.358209
the error rate of this test is:0.298507
the error rate of this test is:0.343284
the error rate of this test is:0.283582
the error rate of this test is:0.313433
the error rate of this test is:0.343284
the error rate of this test is:0.358209
after 10 iterations the average error rate is: 0.346269
4. Summary
Logistic regression means finding the best-fit parameters of a nonlinear function called the sigmoid. Optimization methods can be used to find those parameters; among them, one of the most common is gradient descent, which can in turn be simplified into stochastic gradient descent.
Stochastic gradient descent can do as well as full gradient descent while using far fewer computing resources. In addition, stochastic gradient descent is an online algorithm: it can update what it has learned as new data comes in, rather than reloading all of the data as in batch processing.
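The online property can be sketched as follows: one newly arrived sample triggers one cheap weight update, with no need to revisit the data already seen. (The starting weights and the sample here are placeholders, not values from the horse-colic experiment.)

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def online_update(weights, x, label, alpha=0.01):
    # one stochastic-gradient step on a single new sample
    h = sigmoid(np.dot(x, weights))
    return weights - alpha * (h - label) * x

weights = np.zeros(3)               # previously learned weights (zeros as a stand-in)
new_x = np.array([1.0, 0.5, -0.2])  # a freshly arrived sample
weights = online_update(weights, new_x, 1.0)
print(weights)
```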
One major problem in machine learning is how to deal with missing values in the
data. There’s no blanket answer to this question. It really depends on what you’re
doing with the data. There are a number of solutions, and each solution has its own
advantages and disadvantages.
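One common choice for this horse-colic dataset, used in Machine Learning in Action, is to replace a missing feature value with 0: a zero feature contributes nothing to sum(x * weights), and its weight is left unchanged by the update. A small sketch (the sample values are made up):

```python
import numpy as np

# hypothetical raw sample with missing entries marked as NaN
raw = np.array([2.0, np.nan, 1.0, np.nan])

# replacing a missing value with 0 means it contributes nothing to the
# weighted sum and the corresponding weight is untouched by the SGD update
cleaned = np.where(np.isnan(raw), 0.0, raw)
print(cleaned)  # [2. 0. 1. 0.]
```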