References:

Machine Learning Open Course Notes (5): Neural Networks

CS224d Notes 3: Neural Networks

Deep Learning and Natural Language Processing (4): Stanford cs224d Problem Set 1 with Solutions

CS224d Problem Set 1

softmax:

    def softmax(x):
        assert len(x.shape) > 1
        # Subtract the row-wise max before exponentiating; softmax is
        # shift-invariant, so this changes nothing except preventing
        # overflow in np.exp. Use `x - max` rather than `x -= max` so
        # the caller's array is not modified in place.
        x = x - np.max(x, axis=1, keepdims=True)
        return np.exp(x) / np.sum(np.exp(x), axis=1, keepdims=True)
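A quick way to see why the max-subtraction matters (a minimal check of my own, assuming numpy is imported as np and the softmax above is in scope): without the shift, np.exp(1001) would overflow to inf.

    x = np.array([[1001.0, 1002.0],
                  [3.0, 4.0]])
    print(softmax(x))
    # Both rows print roughly [0.2689, 0.7311]: the rows differ only by
    # a constant shift, so their softmax outputs are identical.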

sigmoid & sigmoid_grad:

    def sigmoid(x):
        # Element-wise logistic function, 1 / (1 + exp(-x)).
        return 1.0 / (1.0 + np.exp(-x))

    def sigmoid_grad(f):
        # Note the argument: f is the sigmoid *output* sigmoid(x), not x.
        # Since sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)), the activation
        # from the forward pass can be reused directly.
        return f * (1.0 - f)
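The identity sigmoid'(x) = sigmoid(x)(1 - sigmoid(x)) is what lets forward_backward_prop below reuse the forward-pass activation in the backward pass. A quick finite-difference check (my own sketch, assuming the two functions above are in scope):

    x = np.array([-2.0, 0.0, 3.0])
    h = 1e-6
    numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)  # central difference
    print(np.allclose(sigmoid_grad(sigmoid(x)), numeric))  # True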

gradcheck_naive:

    def gradcheck_naive(f, x):
        """
        Gradient check for a function f
        - f should be a function that takes a single argument and outputs the
          cost and its gradients
        - x is the point (numpy array) to check the gradient at
        """
        rndstate = random.getstate()
        random.setstate(rndstate)
        fx, grad = f(x)  # Evaluate function value at original point
        h = 1e-4

        # Iterate over all indexes in x
        it = np.nditer(x, flags=['multi_index'], op_flags=['readwrite'])
        while not it.finished:
            ix = it.multi_index

            # Shift x[ix] by +/- h and take the central difference.
            # random.setstate(rndstate) is called before every evaluation
            # of f so that cost functions with built-in randomness produce
            # the same draws each time and remain checkable.
            old_val = x[ix]
            x[ix] = old_val - h
            random.setstate(rndstate)
            fxh1, _ = f(x)

            x[ix] = old_val + h
            random.setstate(rndstate)
            fxh2, _ = f(x)

            numgrad = (fxh2 - fxh1) / (2 * h)
            x[ix] = old_val

            # Compare gradients
            reldiff = abs(numgrad - grad[ix]) / max(1, abs(numgrad), abs(grad[ix]))
            if reldiff > 1e-5:
                print("Gradient check failed.")
                print("First gradient error found at index %s" % str(ix))
                print("Your gradient: %f \t Numerical gradient: %f" % (grad[ix], numgrad))
                return

            it.iternext()  # Step to next dimension

        print("Gradient check passed!")
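The checker is easiest to exercise on a function whose gradient is known in closed form. A hedged sketch along the lines of the assignment's own sanity check, using f(x) = sum(x**2) with gradient 2x:

    quad = lambda x: (np.sum(x ** 2), 2 * x)

    gradcheck_naive(quad, np.array(123.456))      # scalar (0-d array) test
    gradcheck_naive(quad, np.random.randn(3,))    # 1-D test
    gradcheck_naive(quad, np.random.randn(4, 5))  # 2-D test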

neural.py

    import numpy as np
    import random

    from q1_softmax import softmax
    from q2_sigmoid import sigmoid, sigmoid_grad
    from q2_gradcheck import gradcheck_naive

    def forward_backward_prop(data, labels, params, dimensions):
        """
        Forward and backward propagation for a two-layer sigmoidal network

        Compute the forward propagation and the cross-entropy cost,
        and the backward propagation for the gradients of all parameters.
        """
        ### Unpack network parameters (do not modify)
        ofs = 0
        Dx, H, Dy = (dimensions[0], dimensions[1], dimensions[2])

        W1 = np.reshape(params[ofs:ofs + Dx * H], (Dx, H))
        ofs += Dx * H
        b1 = np.reshape(params[ofs:ofs + H], (1, H))
        ofs += H
        W2 = np.reshape(params[ofs:ofs + H * Dy], (H, Dy))
        ofs += H * Dy
        b2 = np.reshape(params[ofs:ofs + Dy], (1, Dy))

        N, D = data.shape

        # data   --> N x Dx
        # W1     --> Dx x H
        # b1     --> 1 x H
        # W2     --> H x Dy
        # b2     --> 1 x Dy
        # labels --> N x Dy

        # Forward propagation
        Z1 = np.dot(data, W1) + b1  # N x H
        A1 = sigmoid(Z1)            # N x H
        Z2 = np.dot(A1, W2) + b2    # N x Dy
        A2 = softmax(Z2)            # N x Dy

        # Cross-entropy cost.
        # First method: expand -sum(labels * log softmax(Z2)) into the
        # log-sum-exp term minus the true-class logits:
        # B = np.exp(Z2)                # N x Dy
        # b = np.sum(B, axis=1) + 1e-8  # N
        # z = np.log(b)                 # N
        # cost = (np.sum(z) - np.sum(Z2 * labels)) / N
        #
        # Second method: labels are one-hot, so boolean indexing picks out
        # the predicted probability of the true class in each row.
        cost = -np.sum(np.log(A2[labels == 1])) / N

        # Backward propagation
        delta2 = A2 - labels                              # N x Dy: dJ/dZ2
        gradb2 = np.sum(delta2, axis=0) / N               # Dy
        gradW2 = np.dot(A1.T, delta2) / N                 # H x Dy
        delta1 = sigmoid_grad(A1) * np.dot(delta2, W2.T)  # N x H: dJ/dZ1
        gradb1 = np.sum(delta1, axis=0) / N               # H
        gradW1 = np.dot(data.T, delta1) / N               # Dx x H

        ### Stack gradients (do not modify)
        grad = np.concatenate((gradW1.flatten(), gradb1.flatten(),
                               gradW2.flatten(), gradb2.flatten()))

        return cost, grad

    def sanity_check():
        """
        Set up fake data and parameters for the neural network, and test using
        gradcheck.
        """
        print("Running sanity check...")

        N = 20
        dimensions = [10, 5, 10]
        data = np.random.randn(N, dimensions[0])  # each row is a datum, 20 x 10
        labels = np.zeros((N, dimensions[2]))
        for i in range(N):
            labels[i, random.randint(0, dimensions[2] - 1)] = 1  # one-hot rows

        params = np.random.randn((dimensions[0] + 1) * dimensions[1] + (
            dimensions[1] + 1) * dimensions[2], )

        gradcheck_naive(lambda params: forward_backward_prop(data, labels, params,
                                                             dimensions), params)

    if __name__ == "__main__":
        sanity_check()
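The key step in the backward pass is delta2 = A2 - labels. With a = softmax(z) and a one-hot label y, the per-example cost is J = -sum_k y_k * log(a_k), and differentiating through the softmax gives dJ/dz_k = a_k - y_k. The remaining lines are the chain rule: gradW2 = A1.T dot delta2, delta1 = sigmoid_grad(A1) times delta2 dot W2.T (elementwise times matrix product), gradW1 = data.T dot delta1, each divided by N because the cost averages over the batch. A numerical spot-check of the softmax-cross-entropy gradient (my own sketch, assuming the softmax above is in scope):

    z = np.random.randn(1, 4)
    y = np.zeros((1, 4))
    y[0, 2] = 1.0  # one-hot label
    cost = lambda z: -np.sum(np.log(softmax(z)[y == 1]))

    h = 1e-6
    num = np.zeros_like(z)
    for j in range(z.shape[1]):
        zp, zm = z.copy(), z.copy()
        zp[0, j] += h
        zm[0, j] -= h
        num[0, j] = (cost(zp) - cost(zm)) / (2 * h)  # central difference

    print(np.allclose(num, softmax(z) - y, atol=1e-6))  # True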
