Training a simple four-layer fully connected network

Ref: http://machinelearningmastery.com/tutorial-first-neural-network-python-keras/

1. Load Data

Dataset: Pima Indians Diabetes Data Set

Download: Data download --> save as: pima-indians-diabetes.csv

from keras.models import Sequential
from keras.layers import Dense
import numpy
# fix random seed for reproducibility
seed = 7
numpy.random.seed(seed)
# 1. Seed the random number generator for reproducible results
# load pima indians dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
# 2. The data is now in a 2-D array: columns 0-7 are the inputs, column 8 is the label
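As a quick sanity check (my addition, not part of the original tutorial), confirm the shapes of the split arrays:

# this dataset has 768 rows, 8 input features, 1 output label
print(X.shape)  # expected: (768, 8)
print(Y.shape)  # expected: (768,)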

2. Define Model

Goal: a four-layer fully connected network

Approach: use a Sequential model and add one layer at a time:

    • Argument 1: number of units in the layer    [layer 1 (input): 8 --> layer 2: 12 --> layer 3: 8 --> layer 4 (output): 1]
    • Argument 2: weight initialization method    [uniform distribution over 0-0.05]
    • Argument 3: activation function             [layers 2 and 3: relu; layer 4: sigmoid]
# create model
model = Sequential()
model.add(Dense(12, input_dim=8, init='uniform', activation='relu'))
model.add(Dense(8, init='uniform', activation='relu'))
model.add(Dense(1, init='uniform', activation='sigmoid'))
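To confirm the stack matches the plan above, a quick summary call (my addition, not in the tutorial) prints each layer with its output shape and parameter count:

# optional check of the architecture just defined
model.summary()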

3. Compile Model

Our goal: find good weights w for making predictions.

# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

Finally, because it is a classification problem, we will collect and report the classification accuracy as the metric.

Alternatively, see <6. Save & load model>: you can load an existing model directly and continue training it.
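For intuition, binary cross-entropy for one example with true label y and predicted probability p is -(y*log(p) + (1-y)*log(1-p)), averaged over the batch. A minimal numpy sketch (my own illustration, not from the tutorial):

import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # clip to avoid log(0), as Keras does internally
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

print(binary_crossentropy(np.array([1, 0, 1]), np.array([0.9, 0.2, 0.7])))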

4. Fit Model

Start training on the data (supervised learning):

    • Number of passes over the whole dataset: epoch
    • Number of samples per gradient update: batch
# Fit the model
history_callback = model.fit(X, Y, nb_epoch=150, batch_size=10)
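The tutorial fits on the full dataset; if you also want a held-out estimate during training, fit supports a validation_split argument (an alternative call, my addition, using the same Keras 1.x API):

# alternative: hold out the last 33% of the rows for validation each epoch
history_callback = model.fit(X, Y, nb_epoch=150, batch_size=10,
                             validation_split=0.33)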

5. Evaluate Model

After training, use model.evaluate(...) to compute accuracy on the data; scores[0] holds the loss and scores[1] the accuracy:

# evaluate the model
scores = model.evaluate(X, Y)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

Result: acc: 78.91%

Epoch 1/150
768/768 [==============================] - 0s - loss: 0.6826 - acc: 0.6328
Epoch 2/150
768/768 [==============================] - 0s - loss: 0.6590 - acc: 0.6510
Epoch 3/150
768/768 [==============================] - 0s - loss: 0.6475 - acc: 0.6549
Epoch 4/150
768/768 [==============================] - 0s - loss: 0.6416 - acc: 0.6615
Epoch 5/150
768/768 [==============================] - 0s - loss: 0.6216 - acc: 0.6745
Epoch 6/150
768/768 [==============================] - 0s - loss: 0.6128 - acc: 0.6680
Epoch 7/150
768/768 [==============================] - 0s - loss: 0.6018 - acc: 0.6927
Epoch 8/150
768/768 [==============================] - 0s - loss: 0.5962 - acc: 0.6927
Epoch 9/150
768/768 [==============================] - 0s - loss: 0.5991 - acc: 0.6953
Epoch 10/150
768/768 [==============================] - 0s - loss: 0.5920 - acc: 0.6927
Epoch 11/150
768/768 [==============================] - 0s - loss: 0.5905 - acc: 0.6979
Epoch 12/150
768/768 [==============================] - 0s - loss: 0.5883 - acc: 0.6901
Epoch 13/150
768/768 [==============================] - 0s - loss: 0.5870 - acc: 0.6953
Epoch 14/150
768/768 [==============================] - 0s - loss: 0.5869 - acc: 0.6836
Epoch 15/150
768/768 [==============================] - 0s - loss: 0.5815 - acc: 0.6953
Epoch 16/150
768/768 [==============================] - 0s - loss: 0.5779 - acc: 0.6966
Epoch 17/150
768/768 [==============================] - 0s - loss: 0.5809 - acc: 0.6849
Epoch 18/150
768/768 [==============================] - 0s - loss: 0.5818 - acc: 0.6953
Epoch 19/150
768/768 [==============================] - 0s - loss: 0.5814 - acc: 0.6901
Epoch 20/150
768/768 [==============================] - 0s - loss: 0.5748 - acc: 0.7096
Epoch 21/150
768/768 [==============================] - 0s - loss: 0.5758 - acc: 0.7005
Epoch 22/150
768/768 [==============================] - 0s - loss: 0.5739 - acc: 0.7135
Epoch 23/150
768/768 [==============================] - 0s - loss: 0.5736 - acc: 0.6927
Epoch 24/150
768/768 [==============================] - 0s - loss: 0.5750 - acc: 0.6940
Epoch 25/150
768/768 [==============================] - 0s - loss: 0.5734 - acc: 0.7031
Epoch 26/150
768/768 [==============================] - 0s - loss: 0.5683 - acc: 0.7083
Epoch 27/150
768/768 [==============================] - 0s - loss: 0.5688 - acc: 0.7018
Epoch 28/150
768/768 [==============================] - 0s - loss: 0.5714 - acc: 0.7070
Epoch 29/150
768/768 [==============================] - 0s - loss: 0.5621 - acc: 0.7188
Epoch 30/150
768/768 [==============================] - 0s - loss: 0.5647 - acc: 0.7122
Epoch 31/150
768/768 [==============================] - 0s - loss: 0.5630 - acc: 0.7135
Epoch 32/150
768/768 [==============================] - 0s - loss: 0.5613 - acc: 0.7214
Epoch 33/150
768/768 [==============================] - 0s - loss: 0.5594 - acc: 0.7188
Epoch 34/150
768/768 [==============================] - 0s - loss: 0.5598 - acc: 0.7187
Epoch 35/150
768/768 [==============================] - 0s - loss: 0.5624 - acc: 0.7187
Epoch 36/150
768/768 [==============================] - 0s - loss: 0.5615 - acc: 0.7201
Epoch 37/150
768/768 [==============================] - 0s - loss: 0.5544 - acc: 0.7214
Epoch 38/150
768/768 [==============================] - 0s - loss: 0.5529 - acc: 0.7135
Epoch 39/150
768/768 [==============================] - 0s - loss: 0.5550 - acc: 0.7227
Epoch 40/150
768/768 [==============================] - 0s - loss: 0.5574 - acc: 0.7331
Epoch 41/150
768/768 [==============================] - 0s - loss: 0.5561 - acc: 0.7357
Epoch 42/150
768/768 [==============================] - 0s - loss: 0.5459 - acc: 0.7370
Epoch 43/150
768/768 [==============================] - 0s - loss: 0.5481 - acc: 0.7240
Epoch 44/150
768/768 [==============================] - 0s - loss: 0.5409 - acc: 0.7331
Epoch 45/150
768/768 [==============================] - 0s - loss: 0.5438 - acc: 0.7422
Epoch 46/150
768/768 [==============================] - 0s - loss: 0.5360 - acc: 0.7344
Epoch 47/150
768/768 [==============================] - 0s - loss: 0.5393 - acc: 0.7357
Epoch 48/150
768/768 [==============================] - 0s - loss: 0.5360 - acc: 0.7435
Epoch 49/150
768/768 [==============================] - 0s - loss: 0.5407 - acc: 0.7370
Epoch 50/150
768/768 [==============================] - 0s - loss: 0.5473 - acc: 0.7344
Epoch 51/150
768/768 [==============================] - 0s - loss: 0.5287 - acc: 0.7448
Epoch 52/150
768/768 [==============================] - 0s - loss: 0.5283 - acc: 0.7539
Epoch 53/150
768/768 [==============================] - 0s - loss: 0.5308 - acc: 0.7396
Epoch 54/150
768/768 [==============================] - 0s - loss: 0.5274 - acc: 0.7448
Epoch 55/150
768/768 [==============================] - 0s - loss: 0.5241 - acc: 0.7539
Epoch 56/150
768/768 [==============================] - 0s - loss: 0.5262 - acc: 0.7526
Epoch 57/150
768/768 [==============================] - 0s - loss: 0.5272 - acc: 0.7422
Epoch 58/150
768/768 [==============================] - 0s - loss: 0.5262 - acc: 0.7539
Epoch 59/150
768/768 [==============================] - 0s - loss: 0.5224 - acc: 0.7604
Epoch 60/150
768/768 [==============================] - 0s - loss: 0.5200 - acc: 0.7513
Epoch 61/150
768/768 [==============================] - 0s - loss: 0.5158 - acc: 0.7578
Epoch 62/150
768/768 [==============================] - 0s - loss: 0.5162 - acc: 0.7513
Epoch 63/150
768/768 [==============================] - 0s - loss: 0.5097 - acc: 0.7552
Epoch 64/150
768/768 [==============================] - 0s - loss: 0.5134 - acc: 0.7487
Epoch 65/150
768/768 [==============================] - 0s - loss: 0.5112 - acc: 0.7435
Epoch 66/150
768/768 [==============================] - 0s - loss: 0.5141 - acc: 0.7656
Epoch 67/150
768/768 [==============================] - 0s - loss: 0.5082 - acc: 0.7539
Epoch 68/150
768/768 [==============================] - 0s - loss: 0.5101 - acc: 0.7643
Epoch 69/150
768/768 [==============================] - 0s - loss: 0.5136 - acc: 0.7409
Epoch 70/150
768/768 [==============================] - 0s - loss: 0.5182 - acc: 0.7474
Epoch 71/150
768/768 [==============================] - 0s - loss: 0.5185 - acc: 0.7370
Epoch 72/150
768/768 [==============================] - 0s - loss: 0.5073 - acc: 0.7539
Epoch 73/150
768/768 [==============================] - 0s - loss: 0.4982 - acc: 0.7682
Epoch 74/150
768/768 [==============================] - 0s - loss: 0.4967 - acc: 0.7591
Epoch 75/150
768/768 [==============================] - 0s - loss: 0.5070 - acc: 0.7617
Epoch 76/150
768/768 [==============================] - 0s - loss: 0.5025 - acc: 0.7526
Epoch 77/150
768/768 [==============================] - 0s - loss: 0.4991 - acc: 0.7604
Epoch 78/150
768/768 [==============================] - 0s - loss: 0.4923 - acc: 0.7656
Epoch 79/150
768/768 [==============================] - 0s - loss: 0.4998 - acc: 0.7695
Epoch 80/150
768/768 [==============================] - 0s - loss: 0.5004 - acc: 0.7526
Epoch 81/150
768/768 [==============================] - 0s - loss: 0.5043 - acc: 0.7552
Epoch 82/150
768/768 [==============================] - 0s - loss: 0.5002 - acc: 0.7656
Epoch 83/150
768/768 [==============================] - 0s - loss: 0.4932 - acc: 0.7617
Epoch 84/150
768/768 [==============================] - 0s - loss: 0.4971 - acc: 0.7604
Epoch 85/150
768/768 [==============================] - 0s - loss: 0.5007 - acc: 0.7513
Epoch 86/150
768/768 [==============================] - 0s - loss: 0.4889 - acc: 0.7656
Epoch 87/150
768/768 [==============================] - 0s - loss: 0.4953 - acc: 0.7591
Epoch 88/150
768/768 [==============================] - 0s - loss: 0.4910 - acc: 0.7669
Epoch 89/150
768/768 [==============================] - 0s - loss: 0.4897 - acc: 0.7604
Epoch 90/150
768/768 [==============================] - 0s - loss: 0.4867 - acc: 0.7643
Epoch 91/150
768/768 [==============================] - 0s - loss: 0.4915 - acc: 0.7669
Epoch 92/150
768/768 [==============================] - 0s - loss: 0.4907 - acc: 0.7630
Epoch 93/150
768/768 [==============================] - 0s - loss: 0.4912 - acc: 0.7604
Epoch 94/150
768/768 [==============================] - 0s - loss: 0.4851 - acc: 0.7630
Epoch 95/150
768/768 [==============================] - 0s - loss: 0.4821 - acc: 0.7682
Epoch 96/150
768/768 [==============================] - 0s - loss: 0.4835 - acc: 0.7669
Epoch 97/150
768/768 [==============================] - 0s - loss: 0.4738 - acc: 0.7773
Epoch 98/150
768/768 [==============================] - 0s - loss: 0.5008 - acc: 0.7474
Epoch 99/150
768/768 [==============================] - 0s - loss: 0.4841 - acc: 0.7682
Epoch 100/150
768/768 [==============================] - 0s - loss: 0.4816 - acc: 0.7669
Epoch 101/150
768/768 [==============================] - 0s - loss: 0.4843 - acc: 0.7695
Epoch 102/150
768/768 [==============================] - 0s - loss: 0.4753 - acc: 0.7891
Epoch 103/150
768/768 [==============================] - 0s - loss: 0.4841 - acc: 0.7630
Epoch 104/150
768/768 [==============================] - 0s - loss: 0.4836 - acc: 0.7786
Epoch 105/150
768/768 [==============================] - 0s - loss: 0.4809 - acc: 0.7708
Epoch 106/150
768/768 [==============================] - 0s - loss: 0.4792 - acc: 0.7786
Epoch 107/150
768/768 [==============================] - 0s - loss: 0.4831 - acc: 0.7734
Epoch 108/150
768/768 [==============================] - 0s - loss: 0.4783 - acc: 0.7852
Epoch 109/150
768/768 [==============================] - 0s - loss: 0.4784 - acc: 0.7708
Epoch 110/150
768/768 [==============================] - 0s - loss: 0.4803 - acc: 0.7682
Epoch 111/150
768/768 [==============================] - 0s - loss: 0.4704 - acc: 0.7734
Epoch 112/150
768/768 [==============================] - 0s - loss: 0.4752 - acc: 0.7878
Epoch 113/150
768/768 [==============================] - 0s - loss: 0.4776 - acc: 0.7760
Epoch 114/150
768/768 [==============================] - 0s - loss: 0.4849 - acc: 0.7604
Epoch 115/150
768/768 [==============================] - 0s - loss: 0.4773 - acc: 0.7682
Epoch 116/150
768/768 [==============================] - 0s - loss: 0.4712 - acc: 0.7773
Epoch 117/150
768/768 [==============================] - 0s - loss: 0.4675 - acc: 0.7786
Epoch 118/150
768/768 [==============================] - 0s - loss: 0.4660 - acc: 0.7839
Epoch 119/150
768/768 [==============================] - 0s - loss: 0.4702 - acc: 0.7891
Epoch 120/150
768/768 [==============================] - 0s - loss: 0.4699 - acc: 0.7852
Epoch 121/150
768/768 [==============================] - 0s - loss: 0.4786 - acc: 0.7852
Epoch 122/150
768/768 [==============================] - 0s - loss: 0.4745 - acc: 0.7786
Epoch 123/150
768/768 [==============================] - 0s - loss: 0.4684 - acc: 0.7839
Epoch 124/150
768/768 [==============================] - 0s - loss: 0.4709 - acc: 0.7760
Epoch 125/150
768/768 [==============================] - 0s - loss: 0.4699 - acc: 0.7747
Epoch 126/150
768/768 [==============================] - 0s - loss: 0.4649 - acc: 0.7747
Epoch 127/150
768/768 [==============================] - 0s - loss: 0.4709 - acc: 0.7708
Epoch 128/150
768/768 [==============================] - 0s - loss: 0.4573 - acc: 0.7982
Epoch 129/150
768/768 [==============================] - 0s - loss: 0.4646 - acc: 0.7943
Epoch 130/150
768/768 [==============================] - 0s - loss: 0.4775 - acc: 0.7773
Epoch 131/150
768/768 [==============================] - 0s - loss: 0.4613 - acc: 0.7799
Epoch 132/150
768/768 [==============================] - 0s - loss: 0.4608 - acc: 0.7799
Epoch 133/150
768/768 [==============================] - 0s - loss: 0.4737 - acc: 0.7826
Epoch 134/150
768/768 [==============================] - 0s - loss: 0.4711 - acc: 0.7773
Epoch 135/150
768/768 [==============================] - 0s - loss: 0.4665 - acc: 0.7839
Epoch 136/150
768/768 [==============================] - 0s - loss: 0.4579 - acc: 0.7969
Epoch 137/150
768/768 [==============================] - 0s - loss: 0.4621 - acc: 0.7917
Epoch 138/150
768/768 [==============================] - 0s - loss: 0.4684 - acc: 0.7760
Epoch 139/150
768/768 [==============================] - 0s - loss: 0.4597 - acc: 0.7839
Epoch 140/150
768/768 [==============================] - 0s - loss: 0.4593 - acc: 0.7799
Epoch 141/150
768/768 [==============================] - 0s - loss: 0.4624 - acc: 0.7799
Epoch 142/150
768/768 [==============================] - 0s - loss: 0.4609 - acc: 0.7786
Epoch 143/150
768/768 [==============================] - 0s - loss: 0.4648 - acc: 0.7826
Epoch 144/150
768/768 [==============================] - 0s - loss: 0.4541 - acc: 0.8060
Epoch 145/150
768/768 [==============================] - 0s - loss: 0.4597 - acc: 0.7852
Epoch 146/150
768/768 [==============================] - 0s - loss: 0.4639 - acc: 0.7891
Epoch 147/150
768/768 [==============================] - 0s - loss: 0.4548 - acc: 0.7865
Epoch 148/150
768/768 [==============================] - 0s - loss: 0.4659 - acc: 0.7786
Epoch 149/150
768/768 [==============================] - 0s - loss: 0.4596 - acc: 0.7799
Epoch 150/150
768/768 [==============================] - 0s - loss: 0.4615 - acc: 0.7773
 32/768 [>.............................] - ETA: 0s
acc: 78.91%


6. Save & load model

Analyzing the log: see "How to log Keras loss output to a file". The History callback returned by model.fit stores the per-epoch metrics:

loss_history = history_callback.history["loss"]
acc_history = history_callback.history["acc"]
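To persist these curves for later plotting, one minimal option (my own sketch, not from the linked post) is numpy.savetxt:

import numpy
# write one loss value per line; loss_history is a plain Python list
numpy.savetxt("loss_history.txt", numpy.array(loss_history), delimiter=",")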

Save and Load Your Keras Deep Learning Models

    • Model architecture: model.json (option: JSON or YAML format)
    • Weights: model.h5
from keras.models import model_from_json, model_from_yaml

# serialize model to JSON
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

# later... load JSON and create model
json_file = open('model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

# serialize model to YAML
model_yaml = model.to_yaml()
with open("model.yaml", "w") as yaml_file:
    yaml_file.write(model_yaml)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

# later... load YAML and create model
yaml_file = open('model.yaml', 'r')
loaded_model_yaml = yaml_file.read()
yaml_file.close()
loaded_model = model_from_yaml(loaded_model_yaml)
# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

7. Make Predictions

Load new data with numpy.loadtxt(...) and place it in X.

# calculate predictions
predictions = model.predict(X)
# round predictions
rounded = [int(round(x[0])) for x in predictions]
print(rounded)
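For a single new record, the input must still be a 2-D array of shape (1, 8); a small sketch (my addition, reusing the values of the first record of the dataset):

import numpy
# one record with the 8 input features
new_sample = numpy.array([[6, 148, 72, 35, 0, 33.6, 0.627, 50]])
print(model.predict(new_sample))  # predicted probability of class 1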

Extra credit: Multilayer Perceptron (understand how the diagram corresponds to the code)

Code: a Multilayer Perceptron

import os
import numpy as np
np.random.seed(1337)  # for reproducibility

from keras.datasets import mnist  # downloaded automatically on first use
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.optimizers import RMSprop
from keras.utils import np_utils

batch_size = 128  # Number of images used in each optimization step
nb_classes = 10   # One class per digit
nb_epoch = 12     # Number of times the whole data is used to learn

# Flatten the data: the network takes 1-D vectors as input, not 2-D images
(X_train, y_train), (X_test, y_test) = mnist.load_data()
X_train = X_train.reshape(60000, 784)
X_test = X_test.reshape(10000, 784)

# Make the values floats in [0;1] instead of ints in [0;255] --> normalization
X_train = X_train.astype('float32')
X_test = X_test.astype('float32')
X_train /= 255
X_test /= 255

# Display the shapes to check if everything is OK
print(X_train.shape[0], 'train samples')
print(X_test.shape[0], 'test samples')

# Convert class vectors to binary class matrices (i.e. one-hot vectors)
Y_train = np_utils.to_categorical(y_train, nb_classes)
Y_test = np_utils.to_categorical(y_test, nb_classes)

# Define the model architecture
model = Sequential()
########################################################################################
model.add(Dense(512, input_shape=(784,)))
model.add(Activation('relu'))
model.add(Dropout(0.2))
model.add(Dense(512))
model.add(Activation('relu'))
model.add(Dropout(0.2))
model.add(Dense(10))  # Last layer with one output per class (one-hot representation)
model.add(Activation('softmax'))  # We want a score similar to a probability for each class
########################################################################################
# Use RMSprop to do the gradient descent, see http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf
# and http://cs231n.github.io/neural-networks-3/#ada
rms = RMSprop()  # RMSprop optimizer (adaptive-learning-rate gradient descent)
# The function to optimize is the cross entropy between the true label and the output (softmax) of the model
model.compile(loss='categorical_crossentropy', optimizer=rms, metrics=["accuracy"])

# Make the model learn --> [Training]
model.fit(X_train, Y_train,
          batch_size=batch_size, nb_epoch=nb_epoch,
          verbose=2,
          validation_data=(X_test, Y_test))

# Evaluate how the model does on the test set
score = model.evaluate(X_test, Y_test, verbose=0)
print('Test score:', score[0])
print('Test accuracy:', score[1])
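One detail worth spelling out is the one-hot conversion: np_utils.to_categorical turns an integer label such as 3 into a length-10 vector with a 1 in position 3. A tiny illustration (my own, assuming the same Keras 1.x np_utils API):

from keras.utils import np_utils
print(np_utils.to_categorical([3], 10))
# -> [[ 0.  0.  0.  1.  0.  0.  0.  0.  0.  0.]]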
