SimpleRNN

The training set is the English text of "Alice in Wonderland" (txt file). Goal: given a randomly chosen seed of 10 characters, generate a plausible next 100 characters.

Building the character-vector space

In [4]: INPUT_FILE = "./data/alice_in_wonderland.txt"

In [5]: fin = open(INPUT_FILE, 'rb')
   ...: lines = []
   ...: for line in fin:
   ...:     line = line.strip().lower()
   ...:     line = line.decode("ascii", "ignore")
   ...:     if len(line) == 0:
   ...:         continue
   ...:     lines.append(line)
   ...: fin.close()
   ...: text = " ".join(lines)
   ...:

In [6]: lines[:20]
Out[6]:
['down, down, down. there was nothing else to do, so alice soon began',
'talking again. "dinah\'ll miss me very much to-night, i should think!"',
'(dinah was the cat.) "i hope they\'ll remember her saucer of milk at',
'tea-time. dinah, my dear, i wish you were down here with me! there are',
"no mice in the air, i'm afraid, but you might catch a bat, and that's",
'very like a mouse, you know. but do cats eat bats, i wonder?" and here',
'alice began to get rather sleepy, and went on saying to herself, in a',
'dreamy sort of way, "do cats eat bats? do cats eat bats?" and sometimes,',
'"do bats eat cats?" for, you see, as she couldn\'t answer either',
"question, it didn't much matter which way she put it. she felt that she",
'was dozing off, and had just begun to dream that she was walking hand in',
'hand with dinah, and saying to her very earnestly, "now, dinah, tell me',
'the truth: did you ever eat a bat?" when suddenly, thump! thump! down',
'she came upon a heap of sticks and dry leaves, and the fall was over.',
'alice was not a bit hurt, and she jumped up on to her feet in a moment:',
'she looked up, but it was all dark overhead; before her was another long',
'passage, and the white rabbit was still in sight, hurrying down it.',
'there was not a moment to be lost: away went alice like the wind, and',
'was just in time to hear it say, as it turned a corner, "oh my ears and',
'whiskers, how late it\'s getting!" she was close behind it when she']

In [7]: text[:10]
Out[7]: 'down, down'

In [8]: chars = set([c for c in text])
   ...: nb_chars = len(chars)
   ...: char2index = dict((c, i) for i, c in enumerate(chars))
   ...: index2char = dict((i, c) for i, c in enumerate(chars))
   ...:

In [9]: nb_chars
Out[9]: 57

In [11]: char2index
Out[11]:
{' ': 49,
'!': 40,
'"': 4,
'$': 52,
'%': 28,
'&': 30,
"'": 17,
'(': 5,
')': 12,
'*': 21,
',': 0,
'-': 13,
'.': 45,
'/': 50,
'0': 51,
'1': 2,
'2': 16,
'3': 15,
'4': 54,
'5': 25,
'6': 48,
'7': 35,
'8': 37,
'9': 32,
':': 39,
';': 10,
'?': 29,
'@': 53,
'[': 11,
']': 47,
'_': 20,
'a': 24,
'b': 26,
'c': 34,
'd': 38,
'e': 27,
'f': 44,
'g': 23,
'h': 41,
'i': 18,
'j': 8,
'k': 7,
'l': 56,
'm': 1,
'n': 22,
'o': 6,
'p': 3,
'q': 14,
'r': 36,
's': 33,
't': 31,
'u': 9,
'v': 42,
'w': 19,
'x': 46,
'y': 43,
'z': 55}

In [12]: len(text)
Out[12]: 159777

In [14]: SEQLEN = 10
    ...: STEP = 1
    ...:
    ...: input_chars = []
    ...: label_chars = []
    ...: for i in range(0, len(text) - SEQLEN, STEP):
    ...:     input_chars.append(text[i:i + SEQLEN])
    ...:     label_chars.append(text[i + SEQLEN])
    ...:

In [15]: input_chars[:10]
Out[15]:
['down, down',
'own, down,',
'wn, down, ',
'n, down, d',
', down, do',
' down, dow',
'down, down',
'own, down.',
'wn, down. ',
'n, down. t']

In [16]: label_chars[:10]
Out[16]: [',', ' ', 'd', 'o', 'w', 'n', '.', ' ', 't', 'h']

In [17]: len(text)
Out[17]: 159777

In [18]: len(input_chars)
Out[18]: 159767

In [19]: len(label_chars)
Out[19]: 159767

In [20]: t=np.zeros((10,3,3))

In [21]: t
Out[21]:
array([[[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]]])

In [22]: t=np.zeros((10,3))

In [23]: t
Out[23]:
array([[0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.]])

In [24]: X = np.zeros((len(input_chars), SEQLEN, nb_chars), dtype=np.bool)
    ...: y = np.zeros((len(input_chars), nb_chars), dtype=np.bool)
    ...: for i, input_char in enumerate(input_chars):
    ...:     for j, ch in enumerate(input_char):
    ...:         X[i, j, char2index[ch]] = 1
    ...:     y[i, char2index[label_chars[i]]] = 1
    ...:

In [25]: X[0]
Out[25]:
array([[False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False,  True, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False],
       [False, False, False, False, False, False,  True, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False],
       [False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False,  True, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False],
       [False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False,  True, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False],
       [ True, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False],
       [False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False,  True, False, False, False, False,
        False, False, False],
       [False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False,  True, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False],
       [False, False, False, False, False, False,  True, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False],
       [False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False,  True, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False],
       [False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False,  True, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False, False, False, False, False, False, False,
        False, False, False]])

In [26]: X[0].shape
Out[26]: (10, 57)

In [27]: input_chars[10]
Out[27]: ', down. th'
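
As a quick sanity check on the vectorization (an illustrative helper, not part of the book's script), a one-hot matrix can be decoded back into its string with index2char:

import numpy as np

def decode_onehot(x, index2char):
    # each row of x is one-hot, so argmax recovers the character index
    return "".join(index2char[i] for i in np.argmax(x, axis=-1))

# decode_onehot(X[0], index2char) should return input_chars[0],
# i.e. 'down, down'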

Model training and prediction

(base) C:\Users\杨景\Desktop\keras深度学习实战\DeepLearningwithKeras_Code\Chapter06>ipython
Python 3.6.4 |Anaconda, Inc.| (default, Jan 16 2018, 10:22:32) [MSC v.1900 64 bit (AMD64)]
Type 'copyright', 'credits' or 'license' for more information
IPython 6.2.1 -- An enhanced Interactive Python. Type '?' for help.

In [1]: input_chars[:10]
---------------------------------------------------------------------------
NameError Traceback (most recent call last)
<ipython-input-1-15f893c11699> in <module>()
----> 1 input_chars[:10]

NameError: name 'input_chars' is not defined

In [2]: %run alice_chargen_rnn.py
F:\ana\lib\site-packages\h5py\__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
from ._conv import register_converters as _register_converters
Using TensorFlow backend.
Extracting text from input...
Creating input and label text...
Vectorizing input and label text...
==================================================
Iteration #: 0
Epoch 1/1
159767/159767 [==============================] - 29s 179us/step - loss: 2.3886
Generating from seed: d all the
d all the sor the she she she she she she she she she she she she she she she she she she she she she she she
==================================================
Iteration #: 1
Epoch 1/1
159767/159767 [==============================] - 26s 162us/step - loss: 2.0846
Generating from seed: t no restr
t no restre the wast on the sart in the sart in the sart in the sart in the sart in the sart in the sart in th
==================================================
Iteration #: 2
Epoch 1/1
159767/159767 [==============================] - 26s 162us/step - loss: 1.9825
Generating from seed: al damages
al damages an and the har her and the said the hat ere had alice and the dore to the dore to the dore to the d
==================================================
Iteration #: 3
Epoch 1/1
159767/159767 [==============================] - 26s 162us/step - loss: 1.8993
Generating from seed: rom being
rom being the mouse and the moute to the more tore to the more tore to the more tore to the more tore to the m
==================================================
Iteration #: 4
Epoch 1/1
159767/159767 [==============================] - 22s 136us/step - loss: 1.8309
Generating from seed: said alic
said alice, and she had fore the said to the king to ghe sore to the king to ghe sore to the king to ghe sore
==================================================
Iteration #: 5
Epoch 1/1
159767/159767 [==============================] - 22s 138us/step - loss: 1.7758
Generating from seed: l, if i mu
l, if i must the couster had her head with a little some of her head with a little some of her head with a lit
==================================================
Iteration #: 6
Epoch 1/1
159767/159767 [==============================] - 25s 156us/step - loss: 1.7290
Generating from seed: d up and r
d up and repponting to see to se project gutenberg-tm the sabe the could alice and the doon a little so the co
==================================================
Iteration #: 7
Epoch 1/1
159767/159767 [==============================] - 21s 129us/step - loss: 1.6894
Generating from seed: ows on it,
ows on it, and this a could not mest were not in a little she had see sous for and whin she had see sous for a
==================================================
Iteration #: 8
Epoch 1/1
159767/159767 [==============================] - 11s 67us/step - loss: 1.6551
Generating from seed: the botto
the botton with a little said to her find it the dormouse said the dormouse said the dormouse said the dormou
==================================================
Iteration #: 9
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.6249
Generating from seed: atures, wh
atures, what i sand the mork of the ont of the same the court and a little she said to herself a little she sa
==================================================
Iteration #: 10
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.5991
Generating from seed: en leaves
en leaves in a little word that she was now she was she was she was she was she was she was she was she was sh
==================================================
Iteration #: 11
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.5769
Generating from seed: oject gute
oject gutenberg-tm ate of the gryphon. "the king to her head the dormouse was a little and alice was not got t
==================================================
Iteration #: 12
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.5563
Generating from seed: "that's v
"that's very such a plowers the rabbit her feet the rabbit her feet the rabbit her feet the rabbit her feet t
==================================================
Iteration #: 13
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.5385
Generating from seed: ee the ear
ee the earing of the the great comation of the words of the toment of she the hatter. "i can't alice was not i
==================================================
Iteration #: 14
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.5226
Generating from seed: ng is, to
ng is, to the growing to the had hear hear hear hear hear hear hear hear hear hear hear hear hear hear hear he
==================================================
Iteration #: 15
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.5077
Generating from seed: " alice we
" alice were out it was a little she had net of the tome with a little she had net of the tome with a little s
==================================================
Iteration #: 16
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.4954
Generating from seed: r in a lan
r in a lanter alice was not a sing to the mock turtle so the caterpillar a cance of the conter alice was not a
==================================================
Iteration #: 17
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.4826
Generating from seed: nd - if yo
nd - if you dread a remesting and the project gutenberg-tm electronic works the project gutenberg-tm electroni
==================================================
Iteration #: 18
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.4732
Generating from seed: in bringin
in bringing the looked down at the mock turtle so mech a little so me went on the looked down at the mock turt
==================================================
Iteration #: 19
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.4613
Generating from seed: onour!" "d
onour!" "do you don't like that it was a rear the words to be a little she had been work the rabbit the conter
==================================================
Iteration #: 20
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.4520
Generating from seed: other par
other parted to be so one of the same the dormouse she heard a comply and alice was a little she had the dorm
==================================================
Iteration #: 21
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.4439
Generating from seed: n] "and ju
n] "and just as the hatter with the tome with the tome with the tome with the tome with the tome with the tome
==================================================
Iteration #: 22
Epoch 1/1
159767/159767 [==============================] - 10s 66us/step - loss: 1.4351
Generating from seed: med to be
med to be alice, "what so the dormouse said to herself in a little so the dormouse said to herself in a little
==================================================
Iteration #: 23
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.4285
Generating from seed: as it spo
as it spoke thing as she could be and the caterpillar of the court. "what said to herself an all this again t
==================================================
Iteration #: 24
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.4214
Generating from seed: her somet
her something the momert like the mouse of a from and she thought the moment she thought the moment she thoug
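
Over the 25 iterations the loss falls steadily from 2.3886 to 1.4214, but the generated text keeps collapsing into loops ("the sart in the sart in ..."). That is an artifact of greedy decoding: the script always emits the argmax character, so once a high-probability cycle appears it repeats forever. A common remedy (not part of this script) is to sample from the softmax output with a temperature; a minimal sketch, assuming pred is the length-nb_chars probability vector returned by model.predict:

import numpy as np

def sample(preds, temperature=0.5):
    # rescale the distribution: temperature < 1 sharpens it toward
    # argmax, temperature > 1 flattens it toward uniform
    preds = np.log(np.asarray(preds, dtype="float64") + 1e-8) / temperature
    exp_preds = np.exp(preds)
    preds = exp_preds / np.sum(exp_preds)
    # draw a single character index from the rescaled distribution
    return int(np.argmax(np.random.multinomial(1, preds, 1)))

# in the generation loop, replace
#   ypred = index2char[np.argmax(pred)]
# with
#   ypred = index2char[sample(pred, temperature=0.5)]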

Model source code:

# -*- coding: utf-8 -*-
# Adapted from lstm_text_generation.py in keras/examples
from __future__ import print_function
from keras.layers.recurrent import SimpleRNN
from keras.models import Sequential
from keras.layers import Dense, Activation
import numpy as np

INPUT_FILE = "./data/alice_in_wonderland.txt"

# extract the input as a stream of characters
print("Extracting text from input...")
fin = open(INPUT_FILE, 'rb')
lines = []
for line in fin:
    line = line.strip().lower()
    line = line.decode("ascii", "ignore")
    if len(line) == 0:
        continue
    lines.append(line)
fin.close()
text = " ".join(lines)

# creating lookup tables
# Here chars is the number of features in our character "vocabulary"
chars = set([c for c in text])
nb_chars = len(chars)
char2index = dict((c, i) for i, c in enumerate(chars))
index2char = dict((i, c) for i, c in enumerate(chars))

# create inputs and labels from the text. We do this by stepping
# through the text ${step} characters at a time, and extracting a
# sequence of size ${seqlen} and the next output char. For example,
# assuming an input text "The sky was falling", we would get the
# following sequence of input_chars and label_chars (first 5 only):
#   The sky wa -> s
#   he sky was -> (space)
#   e sky was  -> f
#    sky was f -> a
#   sky was fa -> l
print("Creating input and label text...")
SEQLEN = 10
STEP = 1

input_chars = []
label_chars = []
for i in range(0, len(text) - SEQLEN, STEP):
    input_chars.append(text[i:i + SEQLEN])
    label_chars.append(text[i + SEQLEN])

# vectorize the input and label chars
# Each row of the input is represented by seqlen characters, each
# represented as a 1-hot encoding of size len(char). There are
# len(input_chars) such rows, so shape(X) is (len(input_chars),
# seqlen, nb_chars).
# Each row of output is a single character, also represented as a
# 1-hot encoding of size len(char). Hence shape(y) is (len(input_chars),
# nb_chars).
print("Vectorizing input and label text...")
X = np.zeros((len(input_chars), SEQLEN, nb_chars), dtype=np.bool)
y = np.zeros((len(input_chars), nb_chars), dtype=np.bool)
for i, input_char in enumerate(input_chars):
    for j, ch in enumerate(input_char):
        X[i, j, char2index[ch]] = 1
    y[i, char2index[label_chars[i]]] = 1

# Build the model. We use a single RNN with a fully connected layer
# to compute the most likely predicted output char
HIDDEN_SIZE = 128
BATCH_SIZE = 128
NUM_ITERATIONS = 25
NUM_EPOCHS_PER_ITERATION = 1
NUM_PREDS_PER_EPOCH = 100

model = Sequential()
model.add(SimpleRNN(HIDDEN_SIZE, return_sequences=False,
                    input_shape=(SEQLEN, nb_chars),
                    unroll=True))
model.add(Dense(nb_chars))
model.add(Activation("softmax"))

model.compile(loss="categorical_crossentropy", optimizer="rmsprop")

# We train the model in batches and test output generated at each step
for iteration in range(NUM_ITERATIONS):
    print("=" * 50)
    print("Iteration #: %d" % (iteration))
    model.fit(X, y, batch_size=BATCH_SIZE, epochs=NUM_EPOCHS_PER_ITERATION)

    # testing model
    # randomly choose a row from input_chars, then use it to
    # generate text from model for next 100 chars
    test_idx = np.random.randint(len(input_chars))
    test_chars = input_chars[test_idx]
    print("Generating from seed: %s" % (test_chars))
    print(test_chars, end="")
    for i in range(NUM_PREDS_PER_EPOCH):
        Xtest = np.zeros((1, SEQLEN, nb_chars))
        # use j here so we don't shadow the prediction counter i
        for j, ch in enumerate(test_chars):
            Xtest[0, j, char2index[ch]] = 1
        pred = model.predict(Xtest, verbose=0)[0]
        ypred = index2char[np.argmax(pred)]
        print(ypred, end="")
        # move forward with test_chars + ypred
        test_chars = test_chars[1:] + ypred
    print()
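
For reference, the network is tiny; with nb_chars = 57 as above, the parameter count works out to about 31K, which is consistent with the fast epochs in the log:

# SimpleRNN: input kernel + recurrent kernel + bias
#   57*128 + 128*128 + 128 = 23,808
# Dense:     kernel + bias
#   128*57 + 57            = 7,353
# Total:                     31,161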
