More articles related to [Sequence Models Notes (1)]

2 Natural Language Processing & Word Embeddings 2.1 Word Representation: given a vocabulary, each word can be represented as a one-hot vector, written as \(O^{5391}\) and so on, where the superscript (the word's index) varies. With one-hot vectors alone, the relationship between any two words cannot be captured, e.g. man/woman; king/queen; apple/orange. Featurized representation: word embedding. Each feature takes a value from -1 to…
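To make the contrast concrete, here is a minimal NumPy sketch of a one-hot vector versus an embedding lookup; the vocabulary size and embedding dimension are illustrative assumptions, while the index 5391 echoes the \(O^{5391}\) notation above.

```python
import numpy as np

# Assumed sizes for illustration; a real vocabulary would come from the corpus.
vocab_size, embed_dim = 10000, 300
word_index = 5391                      # the index used in the O^{5391} example above

# One-hot: a single 1 in a long sparse vector; it says nothing about word similarity.
one_hot = np.zeros(vocab_size)
one_hot[word_index] = 1.0

# Featurized representation (word embedding): each word maps to a dense vector of
# features, so related pairs such as man/woman or king/queen can end up nearby.
E = np.random.randn(embed_dim, vocab_size) * 0.01   # would be learned, not random
e_5391 = E @ one_hot                                 # equivalently E[:, word_index]
print(e_5391.shape)                                  # (300,)
```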
1 Recurrent Neural Networks 1.1 Sequence data: the input, the output, or both are sequences. Examples: speech recognition, natural language processing, music generation, sentiment classification, DNA sequence analysis, machine translation, video activity recognition, named entity recognition. 1.2 Notation: \(x^{(i)<t>}\) denotes the \(t\)-th element of the input of the \(i\)-th training example; \(T_x^{(i)}\) denotes the length of the input of the \(i\)-th training example; \(y^{(i)…
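As a concrete reading of this notation, a small Python sketch follows; the toy training set is made up for illustration and is not data from the notes.

```python
# Illustrating the notation x^{(i)<t>} and T_x^{(i)} on a toy training set.
# The sentences are made-up examples, not data from the notes above.
training_inputs = [
    "Harry Potter and Hermione Granger invented a new spell".split(),
    "the quick brown fox jumps".split(),
]

i, t = 1, 3                              # 1-based indices, as in the course notation
x_i_t = training_inputs[i - 1][t - 1]    # x^{(i)<t>}: the t-th word of example i -> "and"
T_x_i = len(training_inputs[i - 1])      # T_x^{(i)}: input length of example i -> 9
print(x_i_t, T_x_i)
```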
Lesson 5 Sequence Models This article is my set of notes for the fifth course of Andrew Ng's Deep Learning Specialization on Coursera, further organized with reference to other people's notes. Notation: suppose we want to build a sequence model that can automatically recognize the positions of people's names, place names, and so on in a sentence, i.e. a named entity recognition problem, which is commonly used in search engines. A named entity recognition system can be used to find people's names, company names, times, locations, country names, currency names, and more in different kinds of text. We input the sentence "Harry Potter and Hermione Granger invented a new s…
Week 3 Sequence models & Attention mechanism 3.1 Various sequence to sequence architectures First, we build a network called the encoder network (labeled 1 in the figure above). It is an RNN whose units can be GRUs or LSTMs. We feed one French word at a time into the network; once the whole input sequence has been received, the RNN outputs a vector that represents…
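A minimal PyTorch sketch of such an encoder network follows; the vocabulary size and layer dimensions are assumptions for illustration, and nn.GRU could be swapped for nn.LSTM as the excerpt notes.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Reads a source sentence one token at a time and returns a context vector."""
    def __init__(self, vocab_size=8000, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)  # could also be nn.LSTM

    def forward(self, src_ids):
        # src_ids: (batch, T_x) integer token ids of the French sentence
        embedded = self.embed(src_ids)     # (batch, T_x, embed_dim)
        _, hidden = self.rnn(embedded)     # hidden: (1, batch, hidden_dim)
        return hidden.squeeze(0)           # the vector that represents the input sentence

encoder = Encoder()
context = encoder(torch.randint(0, 8000, (1, 7)))  # a 7-token dummy sentence
print(context.shape)                                # torch.Size([1, 512])
```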
Week 1 Recurrent Neural Networks 1.1 Why Sequence Models? 1.2 Notation The input here is a sequence made up of 9 words, so there are 9 sets of features to represent these 9 words, indexed by their position in the sequence, using \(…
Neural Machine Translation Welcome to your first programming assignment for this week! You will build a Neural Machine Translation (NMT) model to translate human-readable dates ("25th of June, 2009") into machine-readable dates ("2009-06-25…
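The assignment is built around (human-readable, machine-readable) date pairs; a rough standard-library sketch of how such pairs could be generated is shown below. The assignment's own data generator and format strings may well differ.

```python
# Sketch: generating (human-readable, machine-readable) date pairs.
import random
from datetime import date, timedelta

def random_date_pair():
    d = date(1970, 1, 1) + timedelta(days=random.randint(0, 20000))
    human = d.strftime("%d %B %Y")      # e.g. "25 June 2009"
    machine = d.isoformat()             # e.g. "2009-06-25"
    return human, machine

print(random_date_pair())
```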
Sequence to Sequence models A basic sequence-to-sequence model; a basic image-to-sequence model, also called an image captioning model. But there are some differences between how you write a model like this to generate a sequence, compared to how you were synthesizi…
Sequence Models This is the fifth and final course of the deep learning specialization at Coursera, which is moderated by deeplearning.ai. Here is the course summary as it is given on the course link: This course will teach you how to build models for n…
LSTMs in PyTorch Example: An LSTM for Part-of-Speech Tagging Exercise: Augmenting the LSTM part-of-speech tagger with character-level features Sequence models are central to NLP: they are models where there is some sort of dependence through time be…
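A condensed sketch of the kind of tagger that tutorial builds is given below; the tiny dimensions and the toy inputs are illustrative choices, not the tutorial's exact code.

```python
import torch
import torch.nn as nn

class LSTMTagger(nn.Module):
    """Maps a sequence of word indices to a tag score for each position."""
    def __init__(self, vocab_size, tagset_size, embed_dim=6, hidden_dim=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim)
        self.hidden2tag = nn.Linear(hidden_dim, tagset_size)

    def forward(self, sentence_ids):
        embeds = self.embed(sentence_ids)                       # (T, embed_dim)
        lstm_out, _ = self.lstm(embeds.view(len(sentence_ids), 1, -1))
        tag_scores = self.hidden2tag(lstm_out.view(len(sentence_ids), -1))
        return torch.log_softmax(tag_scores, dim=1)             # (T, tagset_size)

model = LSTMTagger(vocab_size=5, tagset_size=3)
scores = model(torch.tensor([0, 1, 2, 3, 4]))                   # one 5-word toy sentence
print(scores.shape)                                             # torch.Size([5, 3])
```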
Week 3 Sequence models & Attention mechanism Basic Models This week you will learn about seq2seq (sequence to sequence) models, which are very useful for everything from machine translation to speech recognition, starting from the most basic model. After that you will learn about Beam search and the Attention Model, all the way up to the audio models at the end, such as speech. Let's get started: suppose you want to translate a French sentence, for example "Jane…
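To preview the beam search idea mentioned here, a minimal Python sketch follows; the scoring function is a stand-in for the decoder's conditional log-probability, and the toy vocabulary is invented for illustration.

```python
# Minimal beam-search sketch: keep the B highest-scoring partial sentences at each
# step. `log_prob(next_word, prefix)` stands in for the decoder's conditional
# log-probability P(next_word | input sentence, prefix).
import math

def beam_search(log_prob, vocab, max_len=10, beam_width=3, eos="<eos>"):
    beams = [([], 0.0)]                                   # (partial sentence, log score)
    for _ in range(max_len):
        candidates = []
        for prefix, score in beams:
            if prefix and prefix[-1] == eos:
                candidates.append((prefix, score))        # finished hypothesis, keep as-is
                continue
            for w in vocab:
                candidates.append((prefix + [w], score + log_prob(w, prefix)))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams[0][0]

# Toy usage: a fake "model" that prefers the sentence "jane visits africa <eos>".
target = ["jane", "visits", "africa", "<eos>"]
def toy_log_prob(w, prefix):
    want = target[len(prefix)] if len(prefix) < len(target) else "<eos>"
    return math.log(0.9) if w == want else math.log(0.1 / 4)

print(beam_search(toy_log_prob, vocab=["jane", "visits", "africa", "in", "<eos>"]))
```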