The Backpropagation Algorithm
https://page.mi.fu-berlin.de/rojas/neural/chapter/K7.pdf
7.1 Learning as gradient descent

We saw in the last chapter that multilayered networks are capable of computing a wider range of Boolean functions than networks with a single layer of computing units. However, the computational effort needed for finding the correct combination of weights increases substantially when more parameters and more complicated topologies are considered. In this chapter we discuss a popular learning method capable of handling such large learning problems — the backpropagation algorithm. This numerical method was used by different research communities in different contexts, was discovered and rediscovered, until in 1985 it found its way into connectionist AI mainly through the work of the PDP group [382]. It has been one of the most studied and used algorithms for neural network learning ever since. In this chapter we present a proof of the backpropagation algorithm based on a graphical approach in which the algorithm reduces to a graph labeling problem. This method is not only more general than the usual analytical derivations, which handle only the case of special network topologies, but also much easier to follow. It also shows how the algorithm can be efficiently implemented in computing systems.
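In this framing, learning means choosing the weights that minimize a network error function by following its negative gradient. As a minimal sketch of the idea (the symbols E, o_i, t_i, w_ij and the learning constant γ follow the usual convention and are assumptions here, not quotations from the chapter):

E(\mathbf{w}) = \frac{1}{2}\sum_{i=1}^{p} \lVert \mathbf{o}_i - \mathbf{t}_i \rVert^2,
\qquad
\Delta w_{ij} = -\gamma\,\frac{\partial E}{\partial w_{ij}}

where o_i is the network output for the i-th training example, t_i the corresponding target, and γ > 0 a small learning constant. Backpropagation is the procedure that computes the partial derivatives ∂E/∂w_ij efficiently for every weight in the network.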
The optimization algorithm repeats a two-phase cycle: propagation and weight update. When an input vector is presented to the network, it is propagated forward, layer by layer, until it reaches the output layer. The output of the network is then compared to the desired output using a loss function, and an error value is computed for each neuron in the output layer. These error values are then propagated backwards from the output through the network, until each neuron has an associated error value reflecting its contribution to the original output. Backpropagation uses these error values to calculate the gradient of the loss function with respect to the weights. In the second phase, this gradient is fed to the optimization method, which in turn uses it to update the weights in an attempt to minimize the loss function.
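To make the cycle concrete, the following minimal NumPy sketch performs one forward pass, one backward pass, and one gradient-descent weight update for a two-layer sigmoid network with a squared-error loss. The function name train_step, the layer sizes, and the omission of bias terms are illustrative assumptions, not part of the original text.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, t, W1, W2, lr=0.1):
    """One propagation / weight-update cycle for a two-layer sigmoid
    network with squared-error loss. Shapes: x (n_in,), t (n_out,),
    W1 (n_hidden, n_in), W2 (n_out, n_hidden). W1 and W2 are updated
    in place."""
    # Phase 1a, forward pass: propagate the input layer by layer.
    h = sigmoid(W1 @ x)          # hidden activations
    y = sigmoid(W2 @ h)          # network output

    # Compare output to the desired output with the loss function.
    loss = 0.5 * np.sum((y - t) ** 2)

    # Phase 1b, backward pass: propagate error values from the output
    # layer back, so each unit gets a delta reflecting its contribution.
    delta_out = (y - t) * y * (1 - y)             # error at output units
    delta_hid = (W2.T @ delta_out) * h * (1 - h)  # error at hidden units

    # Gradient of the loss with respect to each weight matrix.
    grad_W2 = np.outer(delta_out, h)
    grad_W1 = np.outer(delta_hid, x)

    # Phase 2: gradient-descent weight update.
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1
    return loss

# Example usage on a single (hypothetical) training pair.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(3, 2))
W2 = rng.normal(scale=0.5, size=(1, 3))
x, t = np.array([0.0, 1.0]), np.array([1.0])
for _ in range(1000):
    loss = train_step(x, t, W1, W2)
print(loss)  # should be close to 0 after repeated updates

Each call performs one full propagation and weight-update cycle; in practice this step is repeated over all training examples (or mini-batches of them) until the loss stops decreasing.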