The goal of backpropagation is to compute the partial derivatives ∂C/∂w and ∂C/∂b of the cost function C with respect to any weight w or bias b in the network.

We use the quadratic cost function

    C = (1/2n) Σ_x ‖y(x) − a^L(x)‖²

where n is the number of training examples, y(x) is the desired output for input x, and a^L(x) is the vector of activations output from the network when x is input.

Two assumptions:

  1: The first assumption we need is that the cost function can be written as an average C = (1/n) Σ_x C_x over cost functions C_x for individual training examples (this is the case for the quadratic cost function, where C_x = (1/2) ‖y(x) − a^L(x)‖²).

  The reason we need this assumption is that what backpropagation actually lets us do is compute the partial derivatives ∂C_x/∂w and ∂C_x/∂b for a single training example. We then recover ∂C/∂w and ∂C/∂b by averaging over training examples. With this assumption in mind, we'll suppose the training example x has been fixed, and drop the x subscript, writing the cost C_x as C. We'll eventually put the x back in, but for now it's a notational nuisance that is better left implicit.

  2: The second assumption is that the cost can be written as a function of the outputs from the neural network, C = C(a^L). The quadratic cost for a single example satisfies this, since C = (1/2) ‖y − a^L‖² = (1/2) Σ_j (y_j − a^L_j)², with y treated as a fixed parameter rather than something the network can change.
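To make these two assumptions concrete, here is a minimal NumPy sketch (the function names are my own, not taken from the book's code): the per-example cost C_x depends only on the network output a^L and the fixed target y, and the total cost is just the average of the C_x.

```python
import numpy as np

def quadratic_cost_x(a_L, y):
    """Per-example quadratic cost C_x = 0.5 * ||y - a_L||^2."""
    return 0.5 * np.sum((y - a_L) ** 2)

def quadratic_cost(outputs, targets):
    """Total cost C = (1/n) * sum_x C_x, an average over examples."""
    return np.mean([quadratic_cost_x(a_L, y)
                    for a_L, y in zip(outputs, targets)])
```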

The Hadamard product: s ⊙ t denotes the elementwise product of two vectors of the same dimension,

    (s ⊙ t)_j = s_j t_j
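In NumPy this is just the elementwise * operator on arrays; a quick check of the definition above:

```python
import numpy as np

s = np.array([1.0, 2.0])
t = np.array([3.0, 4.0])

# Hadamard (elementwise) product: (s ⊙ t)_j = s_j * t_j
print(s * t)  # [3. 8.]
```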

  

The four fundamental equations behind backpropagation

  

BP1 

   δ^l_j ≡ ∂C/∂z^l_j : the error in the j-th neuron in the l-th layer, where z^l_j is the weighted input to that neuron.

    You might wonder why the demon is changing the weighted input z^l_j. Surely it'd be more natural to imagine the demon changing the output activation a^l_j, with the result that we'd be using ∂C/∂a^l_j as our measure of error. In fact, if you do this, things work out quite similarly to the discussion below, but it turns out to make the presentation of backpropagation a little more algebraically complicated. So we'll stick with δ^l_j = ∂C/∂z^l_j as our measure of error.

  An equation for the error in the output layer, δ^L: the components of δ^L are given by

      δ^L_j = (∂C/∂a^L_j) σ′(z^L_j)

  It's easy to rewrite the equation in a matrix-based form, as

      δ^L = ∇_a C ⊙ σ′(z^L)

  where ∇_a C is the vector whose components are the partial derivatives ∂C/∂a^L_j. For the quadratic cost we have ∇_a C = (a^L − y), so BP1 becomes δ^L = (a^L − y) ⊙ σ′(z^L).
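As a sketch, assuming a sigmoid output layer and the quadratic cost (the helper names are mine, not the book's), BP1 can be computed as:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    """Derivative of the sigmoid, σ'(z) = σ(z) * (1 - σ(z))."""
    return sigmoid(z) * (1.0 - sigmoid(z))

def output_error(z_L, a_L, y):
    """BP1 for the quadratic cost: δ^L = (a^L - y) ⊙ σ'(z^L)."""
    return (a_L - y) * sigmoid_prime(z_L)
```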

BP2

    δ^l = ((w^{l+1})^T δ^{l+1}) ⊙ σ′(z^l) : the error δ^l in terms of the error in the next layer, δ^{l+1}, where (w^{l+1})^T is the transpose of the weight matrix for the (l+1)-th layer. Combining BP2 with BP1, we can compute the error δ^l for any layer: first compute δ^L, then apply BP2 to get δ^{L−1}, then δ^{L−2}, and so on, all the way back through the network.
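A one-step sketch of BP2, reusing the sigmoid_prime helper from the previous snippet (the function name is my own):

```python
import numpy as np

def backpropagate_error(w_next, delta_next, z_l):
    """BP2: δ^l = ((w^{l+1})^T δ^{l+1}) ⊙ σ'(z^l).

    w_next is w^{l+1}, delta_next is δ^{l+1}, z_l is z^l.
    """
    return np.dot(w_next.transpose(), delta_next) * sigmoid_prime(z_l)
```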

BP3

    ∂C/∂b^l_j = δ^l_j : the rate of change of the cost with respect to any bias in the network is exactly the error δ^l_j of the corresponding neuron.

BP4

    ∂C/∂w^l_{jk} = a^{l−1}_k δ^l_j : the rate of change of the cost with respect to any weight is the product of the activation of the neuron input to the weight and the error of the neuron output from the weight; with fewer indices, ∂C/∂w = a_in δ_out.
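Taken together, BP3 and BP4 give the gradient for a whole layer once δ^l is known. A minimal sketch, assuming column-vector activations and my own variable names:

```python
import numpy as np

def layer_gradients(delta_l, a_prev):
    """Gradients for one layer from its error δ^l and the previous
    layer's activations a^{l-1} (both column vectors).

    BP3: ∂C/∂b^l_j = δ^l_j
    BP4: ∂C/∂w^l_{jk} = a^{l-1}_k δ^l_j  (an outer product)
    """
    nabla_b = delta_l
    nabla_w = np.dot(delta_l, a_prev.transpose())
    return nabla_b, nabla_w
```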

The backpropagation algorithm

    1: Input x: set the corresponding activation a^1 for the input layer.

    2: Feedforward: for each l = 2, 3, …, L compute z^l = w^l a^{l−1} + b^l and a^l = σ(z^l).

    3: Output error δ^L: compute δ^L = ∇_a C ⊙ σ′(z^L).

    4: Backpropagate the error: for each l = L−1, L−2, …, 2 compute δ^l = ((w^{l+1})^T δ^{l+1}) ⊙ σ′(z^l).

    5: Output: the gradient of the cost function is given by ∂C/∂w^l_{jk} = a^{l−1}_k δ^l_j and ∂C/∂b^l_j = δ^l_j.

      Of course, to implement stochastic gradient descent in practice you also need an outer loop generating mini-batches

    of training examples, and an outer loop stepping through multiple epochs of training. I've omitted those for simplicity.
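Below is a compact sketch of the algorithm for a single training example, in the spirit of the network.py code from the referenced chapter but written from scratch here: the variable names are my own, and a sigmoid activation with quadratic cost is assumed throughout.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    return sigmoid(z) * (1.0 - sigmoid(z))

def backprop(x, y, weights, biases):
    """Return (nabla_b, nabla_w): the gradients of the per-example
    cost C_x with respect to every bias and weight.

    weights and biases are lists of numpy arrays, one entry per layer
    l = 2, ..., L; x and y are column vectors.
    """
    # Step 1: input -- set the activation for the input layer.
    activation = x
    activations = [x]   # stores a^l, layer by layer
    zs = []             # stores z^l, layer by layer

    # Step 2: feedforward -- z^l = w^l a^{l-1} + b^l, a^l = σ(z^l).
    for w, b in zip(weights, biases):
        z = np.dot(w, activation) + b
        zs.append(z)
        activation = sigmoid(z)
        activations.append(activation)

    # Step 3: output error (BP1, quadratic cost): δ^L = (a^L - y) ⊙ σ'(z^L).
    delta = (activations[-1] - y) * sigmoid_prime(zs[-1])

    nabla_b = [np.zeros(b.shape) for b in biases]
    nabla_w = [np.zeros(w.shape) for w in weights]
    nabla_b[-1] = delta                                        # BP3
    nabla_w[-1] = np.dot(delta, activations[-2].transpose())   # BP4

    # Step 4: backpropagate the error (BP2), applying BP3/BP4 per layer.
    num_layers = len(weights) + 1
    for l in range(2, num_layers):
        z = zs[-l]
        delta = np.dot(weights[-l + 1].transpose(), delta) * sigmoid_prime(z)
        nabla_b[-l] = delta
        nabla_w[-l] = np.dot(delta, activations[-l - 1].transpose())

    # Step 5: output -- the gradient of C_x.
    return nabla_b, nabla_w
```

A mini-batch update would average the gradients returned for each example and take one gradient-descent step; those outer loops are the ones omitted above.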

reference: http://neuralnetworksanddeeplearning.com/chap2.html

------------------------------------------------------------------------------------------------

reference: Machine Learning by Andrew Ng
