[Machine Learning][The Analytics Edge][Predicting Earnings from Census Data]
census = read.csv("census.csv")
library(caTools)
set.seed(2000)
spl = sample.split(census$over50k,SplitRatio = 0.6)
train = subset(census,spl == TRUE)
test = subset(census, spl == FALSE)
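# Optional sanity check (not part of the original assignment): sample.split keeps the
# proportion of over50k roughly the same in the training and testing sets.
prop.table(table(train$over50k))
prop.table(table(test$over50k))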
# Fit a logistic regression model using all the other variables as predictors
glm = glm(over50k ~. , data = train, family = "binomial")
summary(glm) # variables with Pr(>|z|) below 0.1 are significant at the 10% level
# Accuracy of the model on the test set, using a threshold of 0.5
glm.pred = predict(glm, newdata = test, type = "response")
table(test$over50k,glm.pred >= 0.5)
(9051+1888)/nrow(test)
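# The same accuracy without hard-coding the counts: the correct predictions sit on the
# diagonal of the confusion matrix (confMat is just a name introduced here for illustration).
confMat = table(test$over50k, glm.pred >= 0.5)
sum(diag(confMat))/nrow(test)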
# Baseline accuracy on the test set: always predict the more frequent outcome
table(test$over50k)
9713/nrow(test)
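# Equivalently, the baseline is the share of the most frequent class:
max(table(test$over50k))/nrow(test)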
# ROC curve & AUC
library(ROCR)
# Build a ROCR prediction object from the predicted probabilities and the actual outcomes
ROCpred = prediction(glm.pred, test$over50k)
plot(performance(ROCpred,measure="tpr",x.measure="fpr"),colorize = TRUE)
as.numeric(performance(ROCpred, "auc")@y.values)
#Problem 2.1 - A CART Model
library(rpart)
library(rpart.plot)
CTree = rpart(over50k ~. , data = train, method = "class")
prp(CTree)
# accuracy of the CART model
CTree.pred = predict(CTree, newdata = test, type = "class")
table(test$over50k,CTree.pred)
(9243+1596)/nrow(test)
# Alternative: generate class probabilities and apply a 0.5 threshold, as with logistic regression
CTree.pred1 = predict(CTree, newdata = test)
p = CTree.pred1[,2] # the column with the predicted probability of ">50K"
table(test$over50k, p >= 0.5) # p >= 0.5 corresponds to predicting ">50K", p < 0.5 to "<=50K"
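# Quick check (a sketch, not part of the assignment): thresholding the probabilities at 0.5
# should reproduce the type = "class" predictions above, so this table should have zero
# off-diagonal entries.
table(CTree.pred, p >= 0.5)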
# ROC curve for the CART model
# (predict without the type = "class" argument so we get probabilities rather than class labels)
library(ROCR)
CTree.ROCpred = prediction(CTree.pred1[,2],test$over50k)
# plot(CTree.ROCpred) will not run; we have to plot a performance object instead
plot(performance(CTree.ROCpred,measure="tpr",x.measure="fpr"),colorize = TRUE)
# calculate the AUC
as.numeric(performance(CTree.ROCpred,"auc")@y.values)
# equivalent way to get the AUC (p is the same probability column, so the result is identical)
CTree.ROCpred2 = prediction(p,test$over50k)
as.numeric(performance(CTree.ROCpred2,"auc")@y.values)
#Problem 3.1 - A Random Forest Model
set.seed(1)
trainSmall = train[sample(nrow(train),2000),]
set.seed(1)
library(randomForest)
RFC = randomForest(over50k ~., data = trainSmall)
RFC.pred = predict(RFC, newdata = test) # over50k is a factor, so the forest does classification and predict() returns class labels; no type = "class" needed
table(test$over50k,RFC.pred)
(9586+1093)/nrow(test) # the counts may differ slightly from run to run because of the forest's randomness
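# Not required by the assignment, but a sketch of how an ROC curve and AUC could be obtained
# for the forest as well: predict class probabilities with type = "prob" (column 2 is the
# ">50K" probability) and feed them to ROCR as before.
RFC.prob = predict(RFC, newdata = test, type = "prob")[,2]
RFC.ROCpred = prediction(RFC.prob, test$over50k)
as.numeric(performance(RFC.ROCpred, "auc")@y.values)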
# Compute a metric showing how often each variable is used for a split across the forest
vu = varUsed(RFC, count = TRUE)
vusorted = sort(vu, decreasing = FALSE, index.return = TRUE)
dotchart(vusorted$x, names(RFC$forest$xlevels[vusorted$ix]))
# Another way to measure variable importance: mean decrease in node impurity (Gini)
varImpPlot(RFC)
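# importance() prints the same importance measures as numbers, which can be easier to
# compare than the plot.
importance(RFC)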
# Select cp for the CART tree by k-fold cross-validation
library(caret)
library(e1071)
set.seed(2)
#Specify that we are going to use k-fold cross validation with 10 folds:
numFolds = trainControl(method = "cv", number = 10)
#Specify the grid of cp values that we wish to evaluate:
cartGrid = expand.grid(.cp = seq(0.002,0.1,0.002))
#run the train function and view the result:
tr = train(over50k ~.,data = train, method = "rpart", trControl = numFolds, tuneGrid = cartGrid)
tr # The final value used for the model was cp = 0.002.
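# Instead of reading the chosen cp off the printed output, it can be pulled from the train
# object directly (tr$bestTune is part of caret's result), and plot(tr) shows the
# cross-validated accuracy against cp. A sketch:
bestCp = tr$bestTune$cp
plot(tr)
# rpart(over50k ~ ., data = train, method = "class", cp = bestCp) would then match CTree2 below.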
CTree2 = rpart(over50k ~., data = train, method = "class", cp = 0.002)
CTree2.pred = predict(CTree2, newdata = test, type = "class")
table(test$over50k, CTree2.pred)
(9178+1838)/nrow(test)
prp(CTree2) # should have 18 splits