Refer to:

Machine Learning Open Course Notes (5): Neural Networks

CS224d Notes 3: Neural Networks

Deep Learning and Natural Language Processing (4): Stanford CS224d Problem Set 1 with Solutions

CS224d Problem Set 1 assignment

softmax:

import numpy as np

def softmax(x):
    """Row-wise softmax. Subtracting the per-row max first keeps
    np.exp from overflowing without changing the result."""
    assert len(x.shape) > 1
    x -= np.max(x, axis=1, keepdims=True)
    x = np.exp(x) / np.sum(np.exp(x), axis=1, keepdims=True)
    return x
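Because each row is shifted by its own maximum, rows that differ only by an additive constant produce identical outputs. A quick check (values chosen just to illustrate this shift-invariance):

    >>> softmax(np.array([[1001, 1002], [3, 4]]))
    array([[ 0.26894142,  0.73105858],
           [ 0.26894142,  0.73105858]])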

sigmoid & sigmoid_grad:

def sigmoid(x):
    """Element-wise sigmoid: 1 / (1 + exp(-x))."""
    result = 1.0 / (1.0 + np.exp(-x))
    return result

def sigmoid_grad(f):
    """Gradient of the sigmoid, expressed in terms of the sigmoid's
    output f = sigmoid(x): sigma'(x) = f * (1 - f)."""
    f = f * (1.0 - f)
    return f
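Note that sigmoid_grad takes the already-computed sigmoid output rather than the raw input; this is what lets the backward pass in neural.py below reuse the stored activations A1 instead of recomputing anything. A small check against the definition (my own illustrative values):

    f = sigmoid(np.array([-1.0, 0.0, 1.0]))
    print sigmoid_grad(f)   # [ 0.19661193  0.25        0.19661193]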

gradcheck_naive:

import numpy as np
import random

def gradcheck_naive(f, x):
    """
    Gradient check for a function f
    - f should be a function that takes a single argument and outputs the
      cost and its gradients
    - x is the point (numpy array) to check the gradient at
    """
    rndstate = random.getstate()
    random.setstate(rndstate)
    fx, grad = f(x)  # Evaluate function value at original point
    h = 1e-4

    # Iterate over all indexes in x
    it = np.nditer(x, flags=['multi_index'], op_flags=['readwrite'])
    while not it.finished:
        ix = it.multi_index

        # Shift x[ix] by h in each direction to form the centered difference.
        # random.setstate(rndstate) is called before every evaluation of f so
        # that cost functions with built-in randomness remain testable.
        old_val = x[ix]
        x[ix] = old_val - h
        random.setstate(rndstate)
        fxh1, _ = f(x)
        x[ix] = old_val + h
        random.setstate(rndstate)
        fxh2, _ = f(x)
        numgrad = (fxh2 - fxh1) / (2 * h)
        x[ix] = old_val  # Restore the original value

        # Compare gradients
        reldiff = abs(numgrad - grad[ix]) / max(1, abs(numgrad), abs(grad[ix]))
        if reldiff > 1e-5:
            print "Gradient check failed."
            print "First gradient error found at index %s" % str(ix)
            print "Your gradient: %f \t Numerical gradient: %f" % (grad[ix], numgrad)
            return

        it.iternext()  # Step to next dimension

    print "Gradient check passed!"
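Because the check uses the centered difference (f(x+h) - f(x-h)) / (2h), the truncation error is O(h^2), so h = 1e-4 gives close agreement for smooth costs. A minimal usage sketch with a function whose gradient is known in closed form, the quadratic np.sum(x**2) with gradient 2x:

    quad = lambda x: (np.sum(x ** 2), x * 2)

    gradcheck_naive(quad, np.array(123.456))      # scalar test
    gradcheck_naive(quad, np.random.randn(3,))    # 1-D test
    gradcheck_naive(quad, np.random.randn(4, 5))  # 2-D test

Each call should print "Gradient check passed!".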

neural.py

import numpy as np
import random

from q1_softmax import softmax
from q2_sigmoid import sigmoid, sigmoid_grad
from q2_gradcheck import gradcheck_naive

def forward_backward_prop(data, labels, params, dimensions):
    """
    Forward and backward propagation for a two-layer sigmoidal network.
    Compute the forward propagation and the cross entropy cost, then the
    backward propagation for the gradients of all parameters.
    """
    ### Unpack network parameters (do not modify)
    ofs = 0
    Dx, H, Dy = (dimensions[0], dimensions[1], dimensions[2])

    W1 = np.reshape(params[ofs:ofs + Dx * H], (Dx, H))
    ofs += Dx * H
    b1 = np.reshape(params[ofs:ofs + H], (1, H))
    ofs += H
    W2 = np.reshape(params[ofs:ofs + H * Dy], (H, Dy))
    ofs += H * Dy
    b2 = np.reshape(params[ofs:ofs + Dy], (1, Dy))

    N, D = data.shape
    # data   --> N x D
    # W1     --> D x H
    # b1     --> 1 x H
    # W2     --> H x V
    # b2     --> 1 x V
    # labels --> N x V

    ### YOUR CODE HERE: forward propagation
    Z1 = np.dot(data, W1) + b1    # N x H
    A1 = sigmoid(Z1)              # N x H
    Z2 = np.dot(A1, W2) + b2      # N x V
    A2 = softmax(Z2)              # N x V

    # cross entropy cost
    # first method (log-sum-exp form):
    #   cost = (np.sum(np.log(np.sum(np.exp(Z2), axis=1) + 1e-8))
    #           - np.sum(Z2 * labels)) / N
    # second method: labels are one-hot, so pick out the predicted
    # probability of each true class directly
    cost = -np.sum(np.log(A2[labels == 1])) / N
    ### END YOUR CODE

    ### YOUR CODE HERE: backward propagation
    delta2 = A2 - labels                              # N x V   delta2 = A2 - y
    gradb2 = np.sum(delta2, axis=0) / N               # 1 x V   gradb2 <-- delta2
    gradW2 = np.dot(A1.T, delta2) / N                 # H x V   gradW2 = A1.T * delta2
    delta1 = sigmoid_grad(A1) * np.dot(delta2, W2.T)  # N x H   delta1 = f'(A1) * (delta2 * W2.T)
    gradb1 = np.sum(delta1, axis=0) / N               # 1 x H   gradb1 <-- delta1
    gradW1 = np.dot(data.T, delta1) / N               # D x H   gradW1 = X.T * delta1
    ### END YOUR CODE

    ### Stack gradients (do not modify)
    grad = np.concatenate((gradW1.flatten(), gradb1.flatten(),
                           gradW2.flatten(), gradb2.flatten()))

    return cost, grad

def sanity_check():
    """
    Set up fake data and parameters for the neural network, and test using
    gradcheck.
    """
    print "Running sanity check..."

    N = 20
    dimensions = [10, 5, 10]
    data = np.random.randn(N, dimensions[0])   # each row is a datum, 20 x 10
    labels = np.zeros((N, dimensions[2]))
    for i in xrange(N):
        labels[i, random.randint(0, dimensions[2] - 1)] = 1  # one-hot vectors

    params = np.random.randn((dimensions[0] + 1) * dimensions[1] + (
        dimensions[1] + 1) * dimensions[2], )

    gradcheck_naive(lambda params: forward_backward_prop(data, labels, params,
        dimensions), params)

if __name__ == "__main__":
    sanity_check()
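Where does delta2 = A2 - labels come from? For a single example with one-hot label $y$, logits $z$ (a row of Z2), and prediction $p = \mathrm{softmax}(z)$, differentiating the cross-entropy loss with respect to the logits collapses to a simple form (using $\sum_k y_k = 1$):

$$\frac{\partial}{\partial z_j}\Big(-\sum_k y_k \log p_k\Big) = \sum_k y_k\,(p_j - \delta_{jk}) = p_j - y_j$$

So the output-layer error signal is just "prediction minus target", which is exactly what the code computes before averaging over the N examples; gradcheck_naive in sanity_check() confirms this numerically.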
