What are the advantages of logistic regression over decision trees?
The answer to "Should I ever use learning algorithm (a) over learning algorithm (b)?" will pretty much always be yes. Different learning algorithms make different assumptions about the data and have different rates of convergence. The one that works best, i.e. minimizes some cost function of interest (cross-validation error, for example), will be the one whose assumptions are consistent with the data and which has sufficiently converged to its error rate.
Put in the context of decision trees vs. logistic regression, what are the assumptions made?
Decision trees assume that the decision boundaries are parallel to the axes. For example, if we have two features (x1, x2), a tree can only create rules such as x1 >= 4.5 or x2 >= 6.5, which we can visualize as lines parallel to the axes.
So decision trees chop up the feature space into rectangles (or, in higher dimensions, hyper-rectangles). Many partitions can be made, so decision trees naturally scale up to more complex functions (with higher VC dimension), which makes them prone to over-fitting.
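To make the axis-parallel point concrete, here is a minimal sketch, assuming scikit-learn and NumPy are available; the toy data and thresholds are invented for illustration. It fits a small tree and prints its learned rules, every one of which is a single-feature threshold:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 2))                    # two features x1, x2
y = ((X[:, 0] >= 4.5) & (X[:, 1] >= 6.5)).astype(int)    # class 1 lives in a rectangle

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["x1", "x2"]))     # every rule is "xi <= t" or "xi > t"
```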
What assumptions does logistic regression make? Despite its probabilistic framework, all that logistic regression assumes is that there is one smooth linear decision boundary. It finds that boundary by assuming P(Y|X) has a particular form, namely the inverse logit (sigmoid) function applied to a weighted sum of the features, and then estimating the weights by maximum likelihood.
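As a minimal sketch of that model (the weights below are hypothetical placeholders, not values fitted to any real data), the predicted probability is just the sigmoid of a weighted sum:

```python
import numpy as np

def sigmoid(z):
    """Inverse logit: maps any real number to a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical weights and intercept -- in practice these are found by
# maximum likelihood (e.g. sklearn.linear_model.LogisticRegression).
w = np.array([1.2, -0.7])
b = 0.3
x = np.array([2.0, 1.0])          # one example with two features

p = sigmoid(w @ x + b)            # modelled P(y = 1 | x)
print(p)
```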
However, people get too caught up in that probabilistic framing. The decision boundary logistic regression creates is a linear* boundary that can point in any direction. So if you have data where the true decision boundary is not parallel to the axes, logistic regression picks it out pretty well, whereas a decision tree will have problems.
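Here is a small sketch of that contrast, assuming scikit-learn and a made-up diagonal-boundary dataset: logistic regression recovers the single oblique boundary with one linear rule, while a depth-limited tree has to approximate it with an axis-parallel staircase and typically scores worse at the same complexity.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = (X[:, 1] > X[:, 0]).astype(int)     # true boundary is the diagonal x2 = x1

lr = LogisticRegression()
dt = DecisionTreeClassifier(max_depth=3, random_state=0)
print("logistic regression:", cross_val_score(lr, X, y, cv=5).mean())
print("depth-3 tree       :", cross_val_score(dt, X, y, cv=5).mean())
```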
So, in conclusion:
- Both algorithms are really fast. There isn't much to distinguish them in terms of run-time.
- Logistic regression will work better if there's a single decision boundary, not necessarily parallel to the axes.
- Decision trees can be applied to situations where there's not just one underlying decision boundary, but many, and will work best if the class labels roughly lie in hyper-rectangular regions.
- Logistic regression is intrinsically simple: it has low variance and so is less prone to over-fitting. Decision trees can be scaled up to be very complex and are more liable to over-fit; pruning is applied to avoid this (a small pruning sketch follows this list).
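A minimal illustration of pruning, assuming scikit-learn; cost-complexity pruning via `ccp_alpha` is just one of several pruning strategies, and the alpha values here are arbitrary:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Larger ccp_alpha prunes more aggressively: a simpler tree, less over-fitting.
for alpha in [0.0, 0.005, 0.02]:
    tree = DecisionTreeClassifier(ccp_alpha=alpha, random_state=0)
    score = cross_val_score(tree, X, y, cv=5).mean()
    print(f"ccp_alpha={alpha}: cross-validated accuracy {score:.3f}")
```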
Maybe you'll be left thinking, "I wish decision trees didn't have to create rules that are parallel to the axes." This motivates support vector machines.
Footnotes:
* Linear in your covariates. If you include non-linear transformations or interactions of the features, the boundary will be non-linear in the space of those original covariates.
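As a small sketch of that footnote (assuming scikit-learn), adding polynomial features keeps the model linear in the transformed features while making the boundary curved in the original covariates:

```python
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Degree-2 features add x1^2, x2^2 and x1*x2; the model is still linear in
# these transformed features, but its boundary is a curve in (x1, x2).
nonlinear_lr = make_pipeline(
    PolynomialFeatures(degree=2, include_bias=False),
    LogisticRegression(max_iter=1000),
)
# nonlinear_lr.fit(X, y) would learn a quadratic boundary in the original space.
```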