More related articles on "Practical aspects of deep learning"

Week 1: Practical aspects of Deep Learning. Train / Dev / Test sets. This week we continue learning how to make a neural network work effectively, covering hyperparameter tuning, how to set up your data, and how to make sure the optimization algorithm runs fast enough for the learning algorithm to finish training in a reasonable amount of time. In week 1 we first talk about how to set up a machine-learning problem for a neural network, then about dropout (randomly deactivating units), and then about some techniques for making sure the neural network runs correctly. With these topics in mind, let's start today's lesson. When setting up the training, dev and test data sets, making…
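As a concrete illustration of the data setup discussed in this excerpt, here is a minimal sketch of a train/dev/test split in numpy. The dataset, the feature count, and the 98/1/1 ratio are all assumptions for illustration, not part of the original post.

```python
import numpy as np

# Hypothetical dataset: one example per row, binary labels.
m = 100_000
rng = np.random.default_rng(0)
X = rng.standard_normal((m, 20))
y = rng.integers(0, 2, size=m)

# Shuffle once so train, dev and test come from the same distribution.
perm = rng.permutation(m)
X, y = X[perm], y[perm]

# With a large m, small dev/test fractions (here 1% each) are usually enough.
n_dev = n_test = m // 100
n_train = m - n_dev - n_test
X_train, y_train = X[:n_train], y[:n_train]
X_dev,   y_dev   = X[n_train:n_train + n_dev], y[n_train:n_train + n_dev]
X_test,  y_test  = X[n_train + n_dev:], y[n_train + n_dev:]
```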
Week 1: Practical aspects of Deep Learning. 1.1 Train / Dev / Test sets. When building a new application, it is impossible to predict the right choices and hyperparameters accurately from the start, for example: how many layers the neural network should have; how many hidden units each layer should contain; what the learning rate should be; which activation functions each layer should use. Applied machine learning is a highly iterative process. Intuition gained in one domain or application area usually does not transfer to another; the best decisions depend on the amount of data you have, the number of input features, the hardware configuration you train on, …
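The iterative workflow described above (try a setting, evaluate on the dev set, refine) can be sketched roughly as below. This assumes scikit-learn's MLPClassifier is an acceptable stand-in for the network; the candidate layer sizes, learning rates and activations are made-up examples, not recommendations from the post.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic data just to make the loop runnable.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_dev, y_train, y_dev = train_test_split(X, y, test_size=0.2, random_state=0)

# Hypothetical grid over depth, width, learning rate and activation.
configs = [
    {"hidden_layer_sizes": (64,),     "learning_rate_init": 0.01,  "activation": "relu"},
    {"hidden_layer_sizes": (64, 64),  "learning_rate_init": 0.01,  "activation": "relu"},
    {"hidden_layer_sizes": (128, 64), "learning_rate_init": 0.001, "activation": "tanh"},
]

best_score, best_cfg = -1.0, None
for cfg in configs:
    clf = MLPClassifier(max_iter=300, random_state=0, **cfg).fit(X_train, y_train)
    score = clf.score(X_dev, y_dev)   # judge each idea on the dev set, not the test set
    if score > best_score:
        best_score, best_cfg = score, cfg

print(best_cfg, best_score)
```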
Week 1 Quiz - Practical aspects of deep learning. 1. If you have 10,000,000 examples, how would you split the train/dev/test set? [ ] 98% train, 1% dev, 1% test. Answer…
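For concreteness, the 98/1/1 option works out to the following absolute sizes (a quick check of the arithmetic, not part of the quiz solution):

```python
m = 10_000_000
n_train, n_dev, n_test = m * 98 // 100, m // 100, m // 100
print(n_train, n_dev, n_test)   # 9800000 100000 100000: 100k examples are plenty for dev and test
```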
Gradient Checking Welcome to this week's third programming assignment! You will be implementing gradient checking to make sure that your backpropagation implementation is correct. By completing this assignment you will: - Implement gradient checking…
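The core of the assignment is comparing a two-sided numerical estimate (J(theta + eps) - J(theta - eps)) / (2 * eps) against the analytic gradient. Below is a stand-alone sketch of that idea on a toy cost function of my own choosing; it is not the assignment's code.

```python
import numpy as np

def J(theta):
    # Toy cost function; in the assignment this would be the network's cost.
    return np.sum(theta ** 2) + np.sum(np.sin(theta))

def grad(theta):
    # Analytic gradient of J that we want to verify (as backprop would be).
    return 2 * theta + np.cos(theta)

def gradient_check(theta, eps=1e-7):
    num_grad = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += eps
        minus[i] -= eps
        num_grad[i] = (J(plus) - J(minus)) / (2 * eps)   # two-sided difference
    analytic = grad(theta)
    # Relative difference; values around 1e-7 or smaller suggest the gradient is correct.
    return np.linalg.norm(num_grad - analytic) / (np.linalg.norm(num_grad) + np.linalg.norm(analytic))

print(gradient_check(np.random.randn(5)))
```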
1. Setting up your Machine Learning Application 1.1 Train / Dev / Test sets 1.2 Bias/Variance High bias is called "underfitting": both the training-set error and the dev-set error are high. High variance is called "overfitting": the training-set error is very low while the dev-set error is very high. 1.3 Basic "recipe"…
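Reading those definitions as a rule of thumb, a hedged sketch of the diagnosis could look like the following; the thresholds and the assumed Bayes error are illustrative choices, not values from the original notes.

```python
def diagnose(train_err, dev_err, bayes_err=0.0, gap=0.05):
    """Rough bias/variance diagnosis from train/dev errors (illustrative thresholds)."""
    high_bias = (train_err - bayes_err) > gap        # underfitting: train error itself is high
    high_variance = (dev_err - train_err) > gap      # overfitting: dev error much worse than train
    return high_bias, high_variance

print(diagnose(train_err=0.01, dev_err=0.11))   # (False, True): high variance
print(diagnose(train_err=0.15, dev_err=0.16))   # (True, False): high bias
```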
If your Neural Network model seems to have high variance, which of the following would be promising things to try? Make the Neural Network deeper (N); Get more training data (Y); Get more test data (N); Add regularization (Y); Increase the number of units in each…
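Since adding regularization is one of the suggested fixes for high variance, here is a minimal sketch of L2 (Frobenius-norm) regularization added to a cost, following the usual lambda/(2m) * sum(||W||^2) form; the weight shapes, lambda, and the base cost value are hypothetical.

```python
import numpy as np

def l2_regularized_cost(cross_entropy_cost, weights, lambd, m):
    """Add the L2 penalty (lambd / (2*m)) * sum of squared Frobenius norms to an unregularized cost."""
    l2_penalty = (lambd / (2 * m)) * sum(np.sum(np.square(W)) for W in weights)
    return cross_entropy_cost + l2_penalty

# Hypothetical two-layer weights and an already-computed cross-entropy cost.
W1, W2 = np.random.randn(5, 3), np.random.randn(1, 5)
print(l2_regularized_cost(cross_entropy_cost=0.65, weights=[W1, W2], lambd=0.7, m=1000))
```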
About this Course If you want to break into cutting-edge AI, this course will help you do so. Deep learning engineers are highly sought after, and mastering deep learning will give you numerous new career opportunities. Deep learning is also a new "s…
Jeremy Howard is a well-known figure in the industry. He is a former president and chief scientist of the data-science competition platform Kaggle, and a Kaggle champion himself. He is the youngest faculty member at Singularity University in the US. In 2014, as a Young Global Leader, he gave a keynote speech at the Davos forum. His TED talk, The wonderful and terrifying implications of computers that can learn, has received as many as 2 million views. He also founded E…
Why Deep Learning Works – Key Insights and Saddle Points A quality discussion on the theoretical motivations for deep learning, including distributed representation, deep architecture, and the easily escapable saddle point. By Matthew Mayo. This post…
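As a toy illustration of why saddle points are "easily escapable", the sketch below runs plain gradient descent on f(x, y) = x^2 - y^2, which has a saddle at the origin; a tiny perturbation off the stable axis is enough for the iterate to move away from the saddle. The step size, start point and iteration count are arbitrary choices of mine, not taken from the article.

```python
import numpy as np

def f(p):                      # f(x, y) = x^2 - y^2 has a saddle point at (0, 0)
    return p[0] ** 2 - p[1] ** 2

def grad_f(p):
    return np.array([2.0 * p[0], -2.0 * p[1]])

p = np.array([1.0, 1e-6])      # start almost exactly on the saddle's attracting axis
for _ in range(100):
    p = p - 0.1 * grad_f(p)    # plain gradient descent, fixed step size

print(p, f(p))                 # x has decayed toward 0 while y has grown: the iterate left the saddle
```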
I have recently been studying deep learning and collected some good resources online; here is a summary: Free Online Books by Yoshua Bengio, Ian Goodfellow and Aaron Courville; Neural Networks and Deep Learning by Michael Nielsen; Deep Learning by Microsoft Research; Deep Learning Tutorial by LISA lab, University…