Coursera Deep Learning 2 Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization - week2, Optimization algorithms
Gradient descent
Batch gradient descent, mini-batch gradient descent, stochastic gradient descent.
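As a quick illustration of the difference between the three variants, here is a minimal NumPy sketch (not from the course notebooks; the names make_mini_batches, X, Y, batch_size are my own) of how a training set can be shuffled and split into mini-batches:

```python
import numpy as np

def make_mini_batches(X, Y, batch_size=64, seed=0):
    """Shuffle (X, Y) and split them into mini-batches of size batch_size.

    X has shape (n_features, m) and Y has shape (1, m), following the
    column-per-example convention used in the course.
    """
    rng = np.random.default_rng(seed)
    m = X.shape[1]
    permutation = rng.permutation(m)
    X_shuffled, Y_shuffled = X[:, permutation], Y[:, permutation]

    mini_batches = []
    for start in range(0, m, batch_size):
        end = start + batch_size
        mini_batches.append((X_shuffled[:, start:end], Y_shuffled[:, start:end]))
    return mini_batches
```

With batch_size = m this reduces to batch gradient descent, and with batch_size = 1 to stochastic gradient descent; mini-batch gradient descent sits in between.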
There are many algorithms that work better than plain gradient descent. Before looking at them, we first need to understand the concept of exponentially weighted averages.
An exponentially weighted average is a way of computing a running average that is very cheap in storage and memory, but not very accurate at the start. This leads to the concept of bias correction, which makes the exponentially weighted average more accurate.
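A minimal sketch of the idea (my own illustration, using the lecture's default beta = 0.9): the recursive update is v_t = beta * v_{t-1} + (1 - beta) * theta_t, and the bias-corrected value is v_t / (1 - beta^t).

```python
def ewa_with_bias_correction(values, beta=0.9):
    """Return the exponentially weighted averages of `values`,
    both raw and bias-corrected."""
    v = 0.0
    raw, corrected = [], []
    for t, theta in enumerate(values, start=1):
        v = beta * v + (1 - beta) * theta       # recursive update: only one number is stored
        raw.append(v)
        corrected.append(v / (1 - beta ** t))   # bias correction fixes the low start-up values
    return raw, corrected
```

Only the single running value v needs to be kept, which is why the method is so cheap; the bias correction mainly matters for the first few steps, when v is still biased toward zero.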
Momentum (also called gradient descent with momentum)
Plain gradient descent has the problem shown in the figure below: the iterates oscillate back and forth instead of heading straight toward the optimum, which is especially noticeable when no feature scaling has been done. This motivates a modified algorithm: gradient descent with momentum.
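A minimal sketch of the update rule for a single parameter (the names w, dw, v_dw are placeholders; beta = 0.9 is the usual default from the lecture):

```python
def momentum_update(w, dw, v_dw, learning_rate=0.01, beta=0.9):
    """One gradient-descent-with-momentum step for a single parameter array.

    v_dw is the exponentially weighted average of past gradients; averaging
    damps the back-and-forth oscillations while the component that points
    toward the optimum keeps accumulating.
    """
    v_dw = beta * v_dw + (1 - beta) * dw
    w = w - learning_rate * v_dw
    return w, v_dw
```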
RMSprop
The goal is the same as the momentum method above: make each iteration point toward the optimum as much as possible instead of oscillating back and forth. The algorithm is implemented as follows. The benefits RMSprop brings are faster iterations and the ability to use a larger learning rate.
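A minimal sketch of the RMSprop update (my own illustration; beta2 = 0.999 and epsilon = 1e-8 are the commonly used defaults):

```python
import numpy as np

def rmsprop_update(w, dw, s_dw, learning_rate=0.01, beta2=0.999, epsilon=1e-8):
    """One RMSprop step: keep an exponentially weighted average of the
    squared gradients and divide the update by its square root, so
    directions that oscillate a lot get smaller effective steps."""
    s_dw = beta2 * s_dw + (1 - beta2) * dw ** 2
    w = w - learning_rate * dw / (np.sqrt(s_dw) + epsilon)
    return w, s_dw
```

Because the oscillating directions are divided by a large root-mean-square term, a larger learning rate can be used without diverging.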
Adam optimization algorithm:
It combines the Momentum and RMSprop algorithms. Adam stands for Adaptive Moment Estimation.
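A minimal sketch combining the two updates above (hyperparameter defaults beta1 = 0.9, beta2 = 0.999, epsilon = 1e-8, as suggested in the lecture; variable names are my own):

```python
import numpy as np

def adam_update(w, dw, v_dw, s_dw, t, learning_rate=0.001,
                beta1=0.9, beta2=0.999, epsilon=1e-8):
    """One Adam step for a single parameter array.

    v_dw: momentum term (first moment), s_dw: RMSprop term (second moment),
    t: iteration count starting at 1, used for bias correction.
    """
    v_dw = beta1 * v_dw + (1 - beta1) * dw           # momentum on the gradient
    s_dw = beta2 * s_dw + (1 - beta2) * dw ** 2      # RMSprop on the squared gradient
    v_corr = v_dw / (1 - beta1 ** t)                 # bias correction
    s_corr = s_dw / (1 - beta2 ** t)
    w = w - learning_rate * v_corr / (np.sqrt(s_corr) + epsilon)
    return w, v_dw, s_dw
```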
Learning rate decay
Why? To reduce the oscillation around the optimum in the later stages of training, when a fixed learning rate would keep the steps too large.
What are the common ways to implement it?
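A few schedules from the lecture, sketched in Python (the function names and default constants are my own choices; alpha0 is the initial learning rate and epoch counts from 1 for the square-root schedule):

```python
import math

def inverse_decay(alpha0, epoch, decay_rate=1.0):
    # alpha = alpha0 / (1 + decay_rate * epoch)
    return alpha0 / (1 + decay_rate * epoch)

def exponential_decay(alpha0, epoch, k=0.95):
    # alpha = k^epoch * alpha0
    return (k ** epoch) * alpha0

def sqrt_decay(alpha0, epoch, k=1.0):
    # alpha = k / sqrt(epoch) * alpha0, for epoch >= 1
    return (k / math.sqrt(epoch)) * alpha0
```

Other options mentioned in the course include discrete staircase decay and simply lowering the learning rate manually.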
Local optima and saddle points
In large neural networks, saddle points are likely much more common than local optima.
Ref:
Coursera, Deep Learning, Andrew Ng