Decision Tree builds classification or regression models in the form of a tree structure. It breaks the dataset down into smaller and smaller subsets while incrementally developing an associated decision tree at the same time.

Decision Tree learning uses a top-down recursive method. The basic idea is to construct the tree along the fastest decline of information entropy, so that the entropy of the instances in each leaf node is zero. Each internal node of the tree corresponds to an attribute, and each leaf node corresponds to a class label.
Advantages:

  • A decision tree is easy to explain: it results in a set of rules, which is the same approach humans generally follow when making decisions.
  • Even a complex Decision Tree can be simplified into a visualization that everyone can understand.
  • It has relatively few hyper-parameters.

Information Gain

  • The entropy of a random variable $X$ taking $n$ values with probabilities $p_i$ is:

    $$H(X) = -\sum_{i=1}^{n} p_i \log p_i$$

  • By the information entropy, we can calculate the empirical entropy of a dataset $D$:

    $$H(D) = -\sum_{k=1}^{K} \frac{|C_k|}{|D|} \log_2 \frac{|C_k|}{|D|}$$

    where $|D|$ is the number of instances, $K$ is the number of classes, and $|C_k|$ is the number of instances belonging to class $C_k$.

  • We can also calculate the empirical conditional entropy of $D$ given a feature $A$ that splits $D$ into subsets $D_1, \dots, D_n$ (with $D_{ik}$ the instances of $D_i$ belonging to class $C_k$):

    $$H(D \mid A) = \sum_{i=1}^{n} \frac{|D_i|}{|D|} H(D_i) = -\sum_{i=1}^{n} \frac{|D_i|}{|D|} \sum_{k=1}^{K} \frac{|D_{ik}|}{|D_i|} \log_2 \frac{|D_{ik}|}{|D_i|}$$

  • From these two quantities, we can calculate the information gain:

    $$g(D, A) = H(D) - H(D \mid A)$$

  • Information gain ratio:

    $$g_R(D, A) = \frac{g(D, A)}{H_A(D)}, \qquad H_A(D) = -\sum_{i=1}^{n} \frac{|D_i|}{|D|} \log_2 \frac{|D_i|}{|D|}$$

  • Gini index:

    $$\mathrm{Gini}(p) = \sum_{k=1}^{K} p_k (1 - p_k) = 1 - \sum_{k=1}^{K} p_k^2$$

    For binary classification:

    $$\mathrm{Gini}(p) = 2p(1 - p)$$

    For binary classification and on the condition of feature $A$ (splitting $D$ into $D_1$ and $D_2$):

    $$\mathrm{Gini}(D, A) = \frac{|D_1|}{|D|} \mathrm{Gini}(D_1) + \frac{|D_2|}{|D|} \mathrm{Gini}(D_2)$$
Three Building Algorithms

  • ID3: maximizing information gain
  • C4.5: maximizing the information gain ratio
  • CART
    • Regression Tree: minimizing the squared error.
    • Classification Tree: minimizing the Gini index.
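These criteria map directly onto library implementations. As a minimal sketch with scikit-learn (whose trees are an optimized CART variant rather than literal ID3/C4.5), the `criterion` parameter selects the split measure; parameter names below are those of recent scikit-learn releases.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

X, y = load_iris(return_X_y=True)

# ID3/C4.5-flavoured splitting: information gain (entropy criterion).
entropy_tree = DecisionTreeClassifier(criterion="entropy").fit(X, y)

# CART classification tree: Gini index (the default criterion).
gini_tree = DecisionTreeClassifier(criterion="gini").fit(X, y)

# CART regression tree: squared error.
reg_tree = DecisionTreeRegressor(criterion="squared_error")
```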

Decision Tree Algorithm Pseudocode

  • Place the best attribute of the dataset at the root of the tree. How to select the best attribute is described in Three Building Algorithms above.
  • Split the training set into subsets by the best attribute.
  • Repeat Step 1 and Step 2 on each subset until every branch of the tree ends in a leaf node. A code sketch of these steps follows this list.
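Putting the three steps together, here is a minimal ID3-style sketch in pure Python. It reuses the `information_gain` helper from the sketch above; the dict-based tree representation is just one convenient choice, not a canonical one.

```python
from collections import Counter

def build_tree(rows, labels, features):
    """rows: list of dicts mapping feature name -> value."""
    # Leaf: all labels agree, or there are no features left to split on.
    if len(set(labels)) == 1 or not features:
        return Counter(labels).most_common(1)[0][0]
    # Step 1: place the best attribute (maximum information gain) at the root.
    best = max(features,
               key=lambda f: information_gain([r[f] for r in rows], labels))
    # Step 2: split the training set into subsets by the best attribute.
    node = {best: {}}
    remaining = [f for f in features if f != best]
    for value in set(r[best] for r in rows):
        pairs = [(r, y) for r, y in zip(rows, labels) if r[best] == value]
        sub_rows, sub_labels = zip(*pairs)
        # Step 3: repeat on each subset until every branch ends in a leaf.
        node[best][value] = build_tree(list(sub_rows), list(sub_labels),
                                       remaining)
    return node
```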

Random Forest

Random Forest classifiers work around the weaknesses of a single decision tree by building a whole collection of trees (hence 'forest'), each trained on a random subset of the training samples (bagging, drawn with replacement) and a random subset of the features (drawn without replacement), and then letting the trees vote together on the result.
In one sentence: it builds on CART and adds randomness, of three kinds, listed below with a code sketch after the list.

  • Randomness 1: train each tree on a subset of the training set selected by bagging (sampling with replacement).
  • Randomness 2: train each tree on a subset of the features (sampling without replacement). For example, select 10 features out of the 100 features in the dataset.
  • Randomness 3: add new features by low-dimensional projection.
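In practice the whole scheme is a few lines with scikit-learn, whose RandomForestClassifier implements bagging over samples plus feature subsampling (drawn afresh at each split rather than once per tree). The dataset and parameter values below are illustrative.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Randomness 1: bootstrap=True resamples the training set with replacement.
# Randomness 2: max_features="sqrt" limits the features considered per split.
forest = RandomForestClassifier(n_estimators=100,
                                bootstrap=True,
                                max_features="sqrt",
                                random_state=0)
forest.fit(X_train, y_train)
print(forest.score(X_test, y_test))
```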

Postscript

I tried to show off by writing this post in English, hoping to use it to practice my writing. Reality slapped me in the face without mercy ( ̄ε(# ̄)

Ref:
https://clyyuanzi.gitbooks.io/julymlnotes/content/rf.html
http://www.saedsayad.com/decision_tree.htm
http://dataaspirant.com/2017/01/30/how-decision-tree-algorithm-works/
统计学习方法 (Statistical Learning Methods), Li Hang
