Training a simple four-layer fully connected network

Ref: http://machinelearningmastery.com/tutorial-first-neural-network-python-keras/

1. Load Data

Dataset: Pima Indians Diabetes Data Set

Download: Data download --> save as: pima-indians-diabetes.csv

from keras.models import Sequential
from keras.layers import Dense
import numpy
# fix random seed for reproducibility
seed = 7
numpy.random.seed(seed)
# 1. Seed the random number generator
# load pima indians dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
# 2. The data is now loaded into a 2-D array

2. Define Model

Goal: a four-layer fully connected network

Approach: use a Sequential model and add one layer at a time:

    • Argument 1: number of nodes in the layer    [layer 1 (input): 8 --> layer 2: 12 --> layer 3: 8 --> layer 4 (output): 1]
    • Argument 2: initialization method           [uniform distribution over 0-0.05]
    • Argument 3: activation function             [layers 2 and 3: relu; layer 4: sigmoid]
# create model
model = Sequential()
model.add(Dense(12, input_dim=8, init='uniform', activation='relu'))
model.add(Dense(8, init='uniform', activation='relu'))
model.add(Dense(1, init='uniform', activation='sigmoid'))
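Note: `init='uniform'` (and `nb_epoch` later) is the Keras 1 spelling used throughout this post. On Keras 2+ the argument is `kernel_initializer`; a minimal equivalent sketch, assuming `'random_uniform'` plays the role of the old `'uniform'` initializer:

from keras.models import Sequential
from keras.layers import Dense

# Keras 2+ version of the same four-layer network (sketch);
# 'random_uniform' is assumed to match the old 'uniform' init
model = Sequential()
model.add(Dense(12, input_dim=8, kernel_initializer='random_uniform', activation='relu'))
model.add(Dense(8, kernel_initializer='random_uniform', activation='relu'))
model.add(Dense(1, kernel_initializer='random_uniform', activation='sigmoid'))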

3. Compile Model

Our goal: find good weights w for making predictions.

# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

Finally, because it is a classification problem, we will collect and report the classification accuracy as the metric.

Alternatively, see <6. Save & load model>: you can also load a saved model and continue training it.
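The string 'adam' asks Keras for the Adam optimizer with its default settings. To control those settings yourself, pass an optimizer instance instead — a minimal sketch; the `lr` argument name is the Keras 1 / early Keras 2 spelling (newer releases call it `learning_rate`):

from keras.optimizers import Adam

# same compile step, but with an explicit optimizer object so the
# learning rate can be tuned (0.001 is Adam's documented default)
model.compile(loss='binary_crossentropy', optimizer=Adam(lr=0.001), metrics=['accuracy'])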

4. Fit Model

Start training the model on the data (supervised learning):

    • epoch: number of passes over the whole training set
    • batch: number of samples used per gradient update
# Fit the model
history_callback = model.fit(X, Y, nb_epoch=150, batch_size=10)
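fit() also accepts a `validation_split` argument that holds out a fraction of the data so you can watch generalization during training — a minimal sketch, keeping the `nb_epoch` (Keras 1) spelling used in this post:

# hold out the last 33% of X/Y; val_loss and val_acc are then
# reported next to the training metrics after every epoch
history_callback = model.fit(X, Y, validation_split=0.33,
                             nb_epoch=150, batch_size=10)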

5. Evaluate Model

After training, use model.evaluate(...) to report the accuracy:

# evaluate the model
scores = model.evaluate(X, Y)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

Result: acc: 78.91%

Epoch 1/150
768/768 [==============================] - 0s - loss: 0.6826 - acc: 0.6328
Epoch 2/150
768/768 [==============================] - 0s - loss: 0.6590 - acc: 0.6510
Epoch 3/150
768/768 [==============================] - 0s - loss: 0.6475 - acc: 0.6549
Epoch 4/150
768/768 [==============================] - 0s - loss: 0.6416 - acc: 0.6615
Epoch 5/150
768/768 [==============================] - 0s - loss: 0.6216 - acc: 0.6745
Epoch 6/150
768/768 [==============================] - 0s - loss: 0.6128 - acc: 0.6680
Epoch 7/150
768/768 [==============================] - 0s - loss: 0.6018 - acc: 0.6927
Epoch 8/150
768/768 [==============================] - 0s - loss: 0.5962 - acc: 0.6927
Epoch 9/150
768/768 [==============================] - 0s - loss: 0.5991 - acc: 0.6953
Epoch 10/150
768/768 [==============================] - 0s - loss: 0.5920 - acc: 0.6927
Epoch 11/150
768/768 [==============================] - 0s - loss: 0.5905 - acc: 0.6979
Epoch 12/150
768/768 [==============================] - 0s - loss: 0.5883 - acc: 0.6901
Epoch 13/150
768/768 [==============================] - 0s - loss: 0.5870 - acc: 0.6953
Epoch 14/150
768/768 [==============================] - 0s - loss: 0.5869 - acc: 0.6836
Epoch 15/150
768/768 [==============================] - 0s - loss: 0.5815 - acc: 0.6953
Epoch 16/150
768/768 [==============================] - 0s - loss: 0.5779 - acc: 0.6966
Epoch 17/150
768/768 [==============================] - 0s - loss: 0.5809 - acc: 0.6849
Epoch 18/150
768/768 [==============================] - 0s - loss: 0.5818 - acc: 0.6953
Epoch 19/150
768/768 [==============================] - 0s - loss: 0.5814 - acc: 0.6901
Epoch 20/150
768/768 [==============================] - 0s - loss: 0.5748 - acc: 0.7096
Epoch 21/150
768/768 [==============================] - 0s - loss: 0.5758 - acc: 0.7005
Epoch 22/150
768/768 [==============================] - 0s - loss: 0.5739 - acc: 0.7135
Epoch 23/150
768/768 [==============================] - 0s - loss: 0.5736 - acc: 0.6927
Epoch 24/150
768/768 [==============================] - 0s - loss: 0.5750 - acc: 0.6940
Epoch 25/150
768/768 [==============================] - 0s - loss: 0.5734 - acc: 0.7031
Epoch 26/150
768/768 [==============================] - 0s - loss: 0.5683 - acc: 0.7083
Epoch 27/150
768/768 [==============================] - 0s - loss: 0.5688 - acc: 0.7018
Epoch 28/150
768/768 [==============================] - 0s - loss: 0.5714 - acc: 0.7070
Epoch 29/150
768/768 [==============================] - 0s - loss: 0.5621 - acc: 0.7188
Epoch 30/150
768/768 [==============================] - 0s - loss: 0.5647 - acc: 0.7122
Epoch 31/150
768/768 [==============================] - 0s - loss: 0.5630 - acc: 0.7135
Epoch 32/150
768/768 [==============================] - 0s - loss: 0.5613 - acc: 0.7214
Epoch 33/150
768/768 [==============================] - 0s - loss: 0.5594 - acc: 0.7188
Epoch 34/150
768/768 [==============================] - 0s - loss: 0.5598 - acc: 0.7187
Epoch 35/150
768/768 [==============================] - 0s - loss: 0.5624 - acc: 0.7187
Epoch 36/150
768/768 [==============================] - 0s - loss: 0.5615 - acc: 0.7201
Epoch 37/150
768/768 [==============================] - 0s - loss: 0.5544 - acc: 0.7214
Epoch 38/150
768/768 [==============================] - 0s - loss: 0.5529 - acc: 0.7135
Epoch 39/150
768/768 [==============================] - 0s - loss: 0.5550 - acc: 0.7227
Epoch 40/150
768/768 [==============================] - 0s - loss: 0.5574 - acc: 0.7331
Epoch 41/150
768/768 [==============================] - 0s - loss: 0.5561 - acc: 0.7357
Epoch 42/150
768/768 [==============================] - 0s - loss: 0.5459 - acc: 0.7370
Epoch 43/150
768/768 [==============================] - 0s - loss: 0.5481 - acc: 0.7240
Epoch 44/150
768/768 [==============================] - 0s - loss: 0.5409 - acc: 0.7331
Epoch 45/150
768/768 [==============================] - 0s - loss: 0.5438 - acc: 0.7422
Epoch 46/150
768/768 [==============================] - 0s - loss: 0.5360 - acc: 0.7344
Epoch 47/150
768/768 [==============================] - 0s - loss: 0.5393 - acc: 0.7357
Epoch 48/150
768/768 [==============================] - 0s - loss: 0.5360 - acc: 0.7435
Epoch 49/150
768/768 [==============================] - 0s - loss: 0.5407 - acc: 0.7370
Epoch 50/150
768/768 [==============================] - 0s - loss: 0.5473 - acc: 0.7344
Epoch 51/150
768/768 [==============================] - 0s - loss: 0.5287 - acc: 0.7448
Epoch 52/150
768/768 [==============================] - 0s - loss: 0.5283 - acc: 0.7539
Epoch 53/150
768/768 [==============================] - 0s - loss: 0.5308 - acc: 0.7396
Epoch 54/150
768/768 [==============================] - 0s - loss: 0.5274 - acc: 0.7448
Epoch 55/150
768/768 [==============================] - 0s - loss: 0.5241 - acc: 0.7539
Epoch 56/150
768/768 [==============================] - 0s - loss: 0.5262 - acc: 0.7526
Epoch 57/150
768/768 [==============================] - 0s - loss: 0.5272 - acc: 0.7422
Epoch 58/150
768/768 [==============================] - 0s - loss: 0.5262 - acc: 0.7539
Epoch 59/150
768/768 [==============================] - 0s - loss: 0.5224 - acc: 0.7604
Epoch 60/150
768/768 [==============================] - 0s - loss: 0.5200 - acc: 0.7513
Epoch 61/150
768/768 [==============================] - 0s - loss: 0.5158 - acc: 0.7578
Epoch 62/150
768/768 [==============================] - 0s - loss: 0.5162 - acc: 0.7513
Epoch 63/150
768/768 [==============================] - 0s - loss: 0.5097 - acc: 0.7552
Epoch 64/150
768/768 [==============================] - 0s - loss: 0.5134 - acc: 0.7487
Epoch 65/150
768/768 [==============================] - 0s - loss: 0.5112 - acc: 0.7435
Epoch 66/150
768/768 [==============================] - 0s - loss: 0.5141 - acc: 0.7656
Epoch 67/150
768/768 [==============================] - 0s - loss: 0.5082 - acc: 0.7539
Epoch 68/150
768/768 [==============================] - 0s - loss: 0.5101 - acc: 0.7643
Epoch 69/150
768/768 [==============================] - 0s - loss: 0.5136 - acc: 0.7409
Epoch 70/150
768/768 [==============================] - 0s - loss: 0.5182 - acc: 0.7474
Epoch 71/150
768/768 [==============================] - 0s - loss: 0.5185 - acc: 0.7370
Epoch 72/150
768/768 [==============================] - 0s - loss: 0.5073 - acc: 0.7539
Epoch 73/150
768/768 [==============================] - 0s - loss: 0.4982 - acc: 0.7682
Epoch 74/150
768/768 [==============================] - 0s - loss: 0.4967 - acc: 0.7591
Epoch 75/150
768/768 [==============================] - 0s - loss: 0.5070 - acc: 0.7617
Epoch 76/150
768/768 [==============================] - 0s - loss: 0.5025 - acc: 0.7526
Epoch 77/150
768/768 [==============================] - 0s - loss: 0.4991 - acc: 0.7604
Epoch 78/150
768/768 [==============================] - 0s - loss: 0.4923 - acc: 0.7656
Epoch 79/150
768/768 [==============================] - 0s - loss: 0.4998 - acc: 0.7695
Epoch 80/150
768/768 [==============================] - 0s - loss: 0.5004 - acc: 0.7526
Epoch 81/150
768/768 [==============================] - 0s - loss: 0.5043 - acc: 0.7552
Epoch 82/150
768/768 [==============================] - 0s - loss: 0.5002 - acc: 0.7656
Epoch 83/150
768/768 [==============================] - 0s - loss: 0.4932 - acc: 0.7617
Epoch 84/150
768/768 [==============================] - 0s - loss: 0.4971 - acc: 0.7604
Epoch 85/150
768/768 [==============================] - 0s - loss: 0.5007 - acc: 0.7513
Epoch 86/150
768/768 [==============================] - 0s - loss: 0.4889 - acc: 0.7656
Epoch 87/150
768/768 [==============================] - 0s - loss: 0.4953 - acc: 0.7591
Epoch 88/150
768/768 [==============================] - 0s - loss: 0.4910 - acc: 0.7669
Epoch 89/150
768/768 [==============================] - 0s - loss: 0.4897 - acc: 0.7604
Epoch 90/150
768/768 [==============================] - 0s - loss: 0.4867 - acc: 0.7643
Epoch 91/150
768/768 [==============================] - 0s - loss: 0.4915 - acc: 0.7669
Epoch 92/150
768/768 [==============================] - 0s - loss: 0.4907 - acc: 0.7630
Epoch 93/150
768/768 [==============================] - 0s - loss: 0.4912 - acc: 0.7604
Epoch 94/150
768/768 [==============================] - 0s - loss: 0.4851 - acc: 0.7630
Epoch 95/150
768/768 [==============================] - 0s - loss: 0.4821 - acc: 0.7682
Epoch 96/150
768/768 [==============================] - 0s - loss: 0.4835 - acc: 0.7669
Epoch 97/150
768/768 [==============================] - 0s - loss: 0.4738 - acc: 0.7773
Epoch 98/150
768/768 [==============================] - 0s - loss: 0.5008 - acc: 0.7474
Epoch 99/150
768/768 [==============================] - 0s - loss: 0.4841 - acc: 0.7682
Epoch 100/150
768/768 [==============================] - 0s - loss: 0.4816 - acc: 0.7669
Epoch 101/150
768/768 [==============================] - 0s - loss: 0.4843 - acc: 0.7695
Epoch 102/150
768/768 [==============================] - 0s - loss: 0.4753 - acc: 0.7891
Epoch 103/150
768/768 [==============================] - 0s - loss: 0.4841 - acc: 0.7630
Epoch 104/150
768/768 [==============================] - 0s - loss: 0.4836 - acc: 0.7786
Epoch 105/150
768/768 [==============================] - 0s - loss: 0.4809 - acc: 0.7708
Epoch 106/150
768/768 [==============================] - 0s - loss: 0.4792 - acc: 0.7786
Epoch 107/150
768/768 [==============================] - 0s - loss: 0.4831 - acc: 0.7734
Epoch 108/150
768/768 [==============================] - 0s - loss: 0.4783 - acc: 0.7852
Epoch 109/150
768/768 [==============================] - 0s - loss: 0.4784 - acc: 0.7708
Epoch 110/150
768/768 [==============================] - 0s - loss: 0.4803 - acc: 0.7682
Epoch 111/150
768/768 [==============================] - 0s - loss: 0.4704 - acc: 0.7734
Epoch 112/150
768/768 [==============================] - 0s - loss: 0.4752 - acc: 0.7878
Epoch 113/150
768/768 [==============================] - 0s - loss: 0.4776 - acc: 0.7760
Epoch 114/150
768/768 [==============================] - 0s - loss: 0.4849 - acc: 0.7604
Epoch 115/150
768/768 [==============================] - 0s - loss: 0.4773 - acc: 0.7682
Epoch 116/150
768/768 [==============================] - 0s - loss: 0.4712 - acc: 0.7773
Epoch 117/150
768/768 [==============================] - 0s - loss: 0.4675 - acc: 0.7786
Epoch 118/150
768/768 [==============================] - 0s - loss: 0.4660 - acc: 0.7839
Epoch 119/150
768/768 [==============================] - 0s - loss: 0.4702 - acc: 0.7891
Epoch 120/150
768/768 [==============================] - 0s - loss: 0.4699 - acc: 0.7852
Epoch 121/150
768/768 [==============================] - 0s - loss: 0.4786 - acc: 0.7852
Epoch 122/150
768/768 [==============================] - 0s - loss: 0.4745 - acc: 0.7786
Epoch 123/150
768/768 [==============================] - 0s - loss: 0.4684 - acc: 0.7839
Epoch 124/150
768/768 [==============================] - 0s - loss: 0.4709 - acc: 0.7760
Epoch 125/150
768/768 [==============================] - 0s - loss: 0.4699 - acc: 0.7747
Epoch 126/150
768/768 [==============================] - 0s - loss: 0.4649 - acc: 0.7747
Epoch 127/150
768/768 [==============================] - 0s - loss: 0.4709 - acc: 0.7708
Epoch 128/150
768/768 [==============================] - 0s - loss: 0.4573 - acc: 0.7982
Epoch 129/150
768/768 [==============================] - 0s - loss: 0.4646 - acc: 0.7943
Epoch 130/150
768/768 [==============================] - 0s - loss: 0.4775 - acc: 0.7773
Epoch 131/150
768/768 [==============================] - 0s - loss: 0.4613 - acc: 0.7799
Epoch 132/150
768/768 [==============================] - 0s - loss: 0.4608 - acc: 0.7799
Epoch 133/150
768/768 [==============================] - 0s - loss: 0.4737 - acc: 0.7826
Epoch 134/150
768/768 [==============================] - 0s - loss: 0.4711 - acc: 0.7773
Epoch 135/150
768/768 [==============================] - 0s - loss: 0.4665 - acc: 0.7839
Epoch 136/150
768/768 [==============================] - 0s - loss: 0.4579 - acc: 0.7969
Epoch 137/150
768/768 [==============================] - 0s - loss: 0.4621 - acc: 0.7917
Epoch 138/150
768/768 [==============================] - 0s - loss: 0.4684 - acc: 0.7760
Epoch 139/150
768/768 [==============================] - 0s - loss: 0.4597 - acc: 0.7839
Epoch 140/150
768/768 [==============================] - 0s - loss: 0.4593 - acc: 0.7799
Epoch 141/150
768/768 [==============================] - 0s - loss: 0.4624 - acc: 0.7799
Epoch 142/150
768/768 [==============================] - 0s - loss: 0.4609 - acc: 0.7786
Epoch 143/150
768/768 [==============================] - 0s - loss: 0.4648 - acc: 0.7826
Epoch 144/150
768/768 [==============================] - 0s - loss: 0.4541 - acc: 0.8060
Epoch 145/150
768/768 [==============================] - 0s - loss: 0.4597 - acc: 0.7852
Epoch 146/150
768/768 [==============================] - 0s - loss: 0.4639 - acc: 0.7891
Epoch 147/150
768/768 [==============================] - 0s - loss: 0.4548 - acc: 0.7865
Epoch 148/150
768/768 [==============================] - 0s - loss: 0.4659 - acc: 0.7786
Epoch 149/150
768/768 [==============================] - 0s - loss: 0.4596 - acc: 0.7799
Epoch 150/150
768/768 [==============================] - 0s - loss: 0.4615 - acc: 0.7773
32/768 [>.............................] - ETA: 0s
acc: 78.91%
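Note that this 78.91% is measured on the same data the model was trained on, so it overstates how well the model generalizes. A minimal sketch of evaluating on a held-out split instead (the 67/33 ratio is an arbitrary illustrative choice):

# train on the first 67% of the rows, evaluate on the remaining 33%
n_train = int(0.67 * len(X))
X_train, X_test = X[:n_train], X[n_train:]
Y_train, Y_test = Y[:n_train], Y[n_train:]

model.fit(X_train, Y_train, nb_epoch=150, batch_size=10)
test_scores = model.evaluate(X_test, Y_test)
print("test %s: %.2f%%" % (model.metrics_names[1], test_scores[1]*100))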

6. Save & load model

Analyzing the log (Ref: How to log Keras loss output to a file):

loss_history = history_callback.history["loss"]
acc_history = history_callback.history["acc"]
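To keep these per-epoch values for later plotting, one simple option is to write them out with numpy — a minimal sketch; the file names here are arbitrary:

import numpy
# one loss / accuracy value per epoch, saved as plain text
# (file names are arbitrary choices for this sketch)
numpy.savetxt("loss_history.txt", numpy.array(loss_history), delimiter=",")
numpy.savetxt("acc_history.txt", numpy.array(acc_history), delimiter=",")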

Ref: Save and Load Your Keras Deep Learning Models

    • Model:   model.json (option: json or yaml format)
    • Weights: model.h5
# serialize model to JSON
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

# later... load json and create model
from keras.models import model_from_json
json_file = open('model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

# serialize model to YAML
model_yaml = model.to_yaml()
with open("model.yaml", "w") as yaml_file:
    yaml_file.write(model_yaml)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

# later... load YAML and create model
from keras.models import model_from_yaml
yaml_file = open('model.yaml', 'r')
loaded_model_yaml = yaml_file.read()
yaml_file.close()
loaded_model = model_from_yaml(loaded_model_yaml)
# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

7. Make Predictions

Load new data with numpy.loadtxt(...) and put it in X.

# calculate predictions
predictions = model.predict(X)
# round predictions
rounded = [int(round(x[0])) for x in predictions]
print(rounded)
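To score a single new patient rather than a whole file, the input must still be a 2-D array with the same 8 feature columns — a minimal sketch with hypothetical feature values:

import numpy
# one hypothetical sample: 8 feature values in the dataset's column order
new_sample = numpy.array([[6, 148, 72, 35, 0, 33.6, 0.627, 50]])
prob = model.predict(new_sample)[0][0]  # sigmoid output in [0, 1]
print("predicted class:", int(round(prob)))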

Bonus: Multilayer Perceptron (understand how the diagram corresponds to the code)

Code: a Multilayer Perceptron

import numpy as np
np.random.seed(1337)  # for reproducibility

from keras.datasets import mnist  # downloaded automatically
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.optimizers import RMSprop
from keras.utils import np_utils

batch_size = 128  # Number of images used in each optimization step
nb_classes = 10   # One class per digit
nb_epoch = 12     # Number of times the whole data is used to learn

(X_train, y_train), (X_test, y_test) = mnist.load_data()

# Flatten the data: the network takes 1-D vectors as input, not 2-D image arrays
X_train = X_train.reshape(60000, 784)
X_test = X_test.reshape(10000, 784)

# Make the values floats in [0;1] instead of ints in [0;255] --> [normalization]
X_train = X_train.astype('float32')
X_test = X_test.astype('float32')
X_train /= 255
X_test /= 255

# Display the shapes to check if everything's ok
print(X_train.shape[0], 'train samples')
print(X_test.shape[0], 'test samples')

# convert class vectors to binary class matrices (i.e. one-hot vectors)
Y_train = np_utils.to_categorical(y_train, nb_classes)
Y_test = np_utils.to_categorical(y_test, nb_classes)

# Define the model architecture
model = Sequential()
########################################################################################
model.add(Dense(512, input_shape=(784,)))
model.add(Activation('relu'))
model.add(Dropout(0.2))
model.add(Dense(512))
model.add(Activation('relu'))
model.add(Dropout(0.2))
model.add(Dense(10))  # Last layer with one output per class (one-hot representation)
model.add(Activation('softmax'))  # We want a score similar to a probability for each class
########################################################################################
# Use rmsprop to do the gradient descent, see http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf
# and http://cs231n.github.io/neural-networks-3/#ada
rms = RMSprop()  # a variant of stochastic gradient descent
# The function to optimize is the cross entropy between the true label and the output (softmax) of the model
model.compile(loss='categorical_crossentropy', optimizer=rms, metrics=["accuracy"])

# Make the model learn --> [Training]
model.fit(X_train, Y_train,
          batch_size=batch_size, nb_epoch=nb_epoch,
          verbose=2,
          validation_data=(X_test, Y_test))

# Evaluate how the model does on the test set
score = model.evaluate(X_test, Y_test, verbose=0)
print('Test score:', score[0])
print('Test accuracy:', score[1])
