What are the advantages of ReLU over the sigmoid function in deep neural networks?
The state of the art for non-linearities is to use ReLU instead of the sigmoid function in deep neural networks. What are the advantages?
I know that training a network with ReLU is faster and that it is more biologically inspired. What are the other advantages? (That is, what are the disadvantages of using sigmoid?)
Best answer on Stack Exchange:
Two additional major benefits of ReLUs are sparsity and a reduced likelihood of vanishing gradient. But first recall that the definition of a ReLU is h = max(0, a), where a = Wx + b.
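To make the definition concrete, here is a minimal NumPy sketch (the layer sizes, weights, and inputs are arbitrary, purely for illustration):

```python
import numpy as np

def relu(a):
    # h = max(0, a), applied element-wise
    return np.maximum(0.0, a)

# Illustrative pre-activation a = Wx + b for one layer (sizes chosen arbitrarily)
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))   # 4 units, 3 inputs
x = rng.normal(size=3)
b = rng.normal(size=4)

a = W @ x + b
print(a)          # mix of positive and negative pre-activations
print(relu(a))    # negative entries clamped to exactly 0
```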
One major benefit is the reduced likelihood of the gradient vanishing. This arises when a > 0. In this regime the gradient has a constant value. In contrast, the gradient of sigmoids becomes increasingly small as the absolute value of x increases. The constant gradient of ReLUs results in faster learning.
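A quick numerical check of this point (a sketch, not part of the original answer): the sigmoid derivative σ(x)(1 − σ(x)) collapses toward zero as |x| grows, while the ReLU derivative is a constant 1 whenever a > 0.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)          # maximum value 0.25, reached at x = 0

def relu_grad(a):
    return (a > 0).astype(float)  # 1 where a > 0, else 0

xs = np.array([0.0, 2.0, 5.0, 10.0])
print(sigmoid_grad(xs))  # ~[0.25, 0.105, 0.0066, 0.000045] -- shrinks rapidly
print(relu_grad(xs))     # [0., 1., 1., 1.] -- constant once the unit is active
```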
The other benefit of ReLUs is sparsity. Sparsity arises when a ≤ 0. The more such units that exist in a layer, the sparser the resulting representation. Sigmoids, on the other hand, are always likely to generate some non-zero value, resulting in dense representations. Sparse representations seem to be more beneficial than dense representations.
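To illustrate the sparsity claim, this small sketch (random pre-activations, purely illustrative) counts how many activations come out exactly zero under each non-linearity:

```python
import numpy as np

def relu(a):
    return np.maximum(0.0, a)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

rng = np.random.default_rng(0)
a = rng.normal(size=10_000)      # pre-activations of a hypothetical layer

print(np.mean(relu(a) == 0))     # about 0.5: roughly half the units are exactly zero
print(np.mean(sigmoid(a) == 0))  # 0.0: sigmoid never outputs an exact zero
```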
ReLU
ReLU stands for rectified linear unit. The answer above basically covers the main ways in which it beats the sigmoid function:
- faster
- more biologically inspired
- sparsity
- less chance of vanishing gradient (the vanishing gradient problem)
Early deep networks that used sigmoid or tanh activations and relied on unsupervised learning often failed to converge because of the vanishing gradient problem; ReLU does not suffer from this issue.
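As a rough illustration of why depth makes the sigmoid case so much worse (a toy scalar sketch with unit weights, not a real training run): backpropagation multiplies one local derivative per layer, each sigmoid factor is at most 0.25, so the product shrinks geometrically with depth, while each active ReLU layer contributes a factor of exactly 1.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

depth = 20
h_s = h_r = 0.5              # arbitrary positive scalar input
grad_s = grad_r = 1.0        # gradient accumulated through the stack

for _ in range(depth):
    s = sigmoid(h_s)
    grad_s *= s * (1.0 - s)              # sigmoid local derivative, always <= 0.25
    h_s = s
    grad_r *= 1.0 if h_r > 0 else 0.0    # ReLU local derivative: 1 on the active path
    h_r = max(0.0, h_r)

print(grad_s)   # on the order of 1e-13 after 20 layers -- effectively vanished
print(grad_r)   # 1.0 -- preserved along the active path
```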