Paper: "Fully Convolutional Networks for Semantic Segmentation"
Code: the Caffe implementation of FCN
Dataset: Pascal VOC

I. Dataset Preparation

After downloading the Pascal VOC data, build the image dataset and the label dataset for segmentation, in LMDB or LEVELDB format. It is best to resize the images first (by padding rather than stretching, so the aspect ratio is preserved).
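The pad-then-resize step can be sketched with numpy alone. This is a minimal illustration (the helper name `pad_resize` is ours); nearest-neighbor sampling is used so the same function is safe for label images, where class indices must never be interpolated:

```python
import numpy as np

def pad_resize(img, size=224):
    """Zero-pad an HxW(xC) image to a square (content in the top-left
    corner), then resize to size x size with nearest-neighbor sampling."""
    h, w = img.shape[:2]
    side = max(h, w)
    padded = np.zeros((side, side) + img.shape[2:], dtype=img.dtype)
    padded[:h, :w] = img
    # nearest-neighbor index maps from the target grid back to the source
    ys = np.arange(size) * side // size
    xs = np.arange(size) * side // size
    return padded[ys][:, xs]
```

For real data you would read the PNGs into arrays (e.g. with Pillow or OpenCV), run them through this, and write the resized copies out before building the LMDBs.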
1. Folder structure
The dataset consists of the original images and the label images, as shown below.
 

Then build the corresponding LMDB files. Split the images into train and val at a 4:1 ratio. Each txt file only needs to list the image paths; no label column is required, because the label of an image is itself an image, and the pairing is specified in Caffe.
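The 4:1 split and the four list files can be produced with a short script. This is a sketch; the function names are ours, and the folder names follow the lists below:

```python
import random

def split_train_val(ids, val_ratio=0.2, seed=0):
    """Shuffle image IDs and split them into train/val at (1-val_ratio):val_ratio."""
    ids = sorted(ids)
    random.Random(seed).shuffle(ids)  # fixed seed keeps the split reproducible
    n_val = int(len(ids) * val_ratio)
    return ids[n_val:], ids[:n_val]

def write_lists(train_ids, val_ids):
    # one path per line; images and labels share the same file stem
    for fname, folder, ids in [
        ("Img_train.txt", "SegmentationImage", train_ids),
        ("Label_train.txt", "SegmentationClass", train_ids),
        ("Img_val.txt", "SegmentationImage", val_ids),
        ("Label_val.txt", "SegmentationClass", val_ids),
    ]:
        with open(fname, "w") as f:
            f.writelines(f"{folder}/{i}.png\n" for i in ids)
```
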

Img_train.txt
SegmentationImage/002120.png
SegmentationImage/002132.png
SegmentationImage/002142.png
SegmentationImage/002212.png
SegmentationImage/002234.png
SegmentationImage/002260.png
SegmentationImage/002266.png
SegmentationImage/002268.png
SegmentationImage/002273.png
SegmentationImage/002281.png
SegmentationImage/002284.png
SegmentationImage/002293.png
SegmentationImage/002361.png
Label_train.txt
SegmentationClass/002120.png
SegmentationClass/002132.png
SegmentationClass/002142.png
SegmentationClass/002212.png
SegmentationClass/002234.png
SegmentationClass/002260.png
SegmentationClass/002266.png
SegmentationClass/002268.png
SegmentationClass/002273.png
SegmentationClass/002281.png
SegmentationClass/002284.png
SegmentationClass/002293.png

Note: the training labels have to be generated yourself from the ground-truth images under SegmentationClass. The pixel colors for each class are:

Class R G B
background 0 0 0
aeroplane 128 0 0
bicycle 0 128 0
bird 128 128 0
boat 0 0 128
bottle 128 0 128
bus 0 128 128
car 128 128 128
cat 64 0 0
chair 192 0 0
cow 64 128 0
diningtable 192 128 0
dog 64 0 128
horse 192 0 128
motorbike 64 128 128
person 192 128 128
pottedplant 0 64 0
sheep 128 64 0
sofa 0 192 0
train 128 192 0
tvmonitor 0 64 128

Process the ground-truth images in the dataset to generate the label images used for training.
Note that the label files must be single-channel grayscale; otherwise training fails with an error that the score layer's output does not match the label's size — a channel-mismatch problem.
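Generating those grayscale labels means mapping each RGB color in the table above to its class index. A minimal numpy sketch (`color_to_index` is our own helper; 255 marks unknown colors such as the white void border, the usual VOC convention):

```python
import numpy as np

# (R, G, B) -> class index, following the Pascal VOC color table above
VOC_COLORS = [
    (0, 0, 0), (128, 0, 0), (0, 128, 0), (128, 128, 0), (0, 0, 128),
    (128, 0, 128), (0, 128, 128), (128, 128, 128), (64, 0, 0), (192, 0, 0),
    (64, 128, 0), (192, 128, 0), (64, 0, 128), (192, 0, 128), (64, 128, 128),
    (192, 128, 128), (0, 64, 0), (128, 64, 0), (0, 192, 0), (128, 192, 0),
    (0, 64, 128),
]

def color_to_index(rgb):
    """Convert an HxWx3 uint8 ground-truth image into an HxW uint8 label
    map with class indices 0..20; pixels of any other color are set to 255."""
    label = np.full(rgb.shape[:2], 255, dtype=np.uint8)
    for idx, color in enumerate(VOC_COLORS):
        label[(rgb == color).all(axis=-1)] = idx
    return label
```

Saving the result as a single-channel PNG gives the label LMDB exactly one channel; 255 can then be used as the `ignore_label` of the loss layer.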
Then just generate the LMDBs. The dataset is ready.

II. Network Model Definition

The main issue here is data input: the data and label layers are specified as follows.

layer {
  name: "data"
  type: "Data"
  top: "data"
  include {
    phase: TRAIN
  }
  transform_param {
    scale: 0.00390625
    mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_train_mean.binaryproto"
  }
  data_param {
    source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_train"
    batch_size:
    backend: LMDB
  }
}
layer {
  name: "label"
  type: "Data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    scale: 0.00390625
    mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_train_mean.binaryproto"
  }
  data_param {
    source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_train"
    batch_size:
    backend: LMDB
  }
}
layer {
  name: "data"
  type: "Data"
  top: "data"
  include {
    phase: TEST
  }
  transform_param {
    scale: 0.00390625
    mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_val_mean.binaryproto"
  }
  data_param {
    source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_val"
    batch_size:
    backend: LMDB
  }
}
layer {
  name: "label"
  type: "Data"
  top: "label"
  include {
    phase: TEST
  }
  transform_param {
    scale: 0.00390625
    mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_val_mean.binaryproto"
  }
  data_param {
    source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_val"
    batch_size:
    backend: LMDB
  }
}

III. Network Training

It is best to fine-tune from pretrained weights; otherwise the loss decreases far too slowly.
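For reference, the solver can be reconstructed roughly as follows. The parameters visible in the log (lr_policy, momentum, weight_decay, snapshot_prefix, net) are the author's; the numeric values marked "illustrative" were lost from the log and are only placeholders:

```
net: "train_val.prototxt"
test_iter: 100          # illustrative
test_interval: 500      # illustrative
base_lr: 1e-4           # illustrative: the log shows only "1e-"
lr_policy: "fixed"
momentum: 0.95
weight_decay: 0.0005
display: 20             # illustrative
max_iter: 100000        # illustrative
snapshot: 10000         # illustrative
snapshot_prefix: "FCN"
solver_mode: GPU
```

Fine-tuning is then started with the Caffe CLI, e.g. `caffe train -solver solver.prototxt -weights vgg16.caffemodel`, where the weights file is whichever pretrained model you fine-tune from.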

 Log file created at: // ::
Running on machine: DESKTOP
Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
I1213 ::07.177220 caffe.cpp:] Using GPUs
I1213 ::07.436894 caffe.cpp:] GPU : GeForce GTX
I1213 ::07.758122 common.cpp:] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 ::07.758623 solver.cpp:] Initializing solver from parameters:
test_iter:
test_interval:
base_lr: 1e-
display:
max_iter:
lr_policy: "fixed"
momentum: 0.95
weight_decay: 0.0005
snapshot:
snapshot_prefix: "FCN"
solver_mode: GPU
device_id:
net: "train_val.prototxt"
train_state {
level:
stage: ""
}
iter_size:
I1213 ::07.759624 solver.cpp:] Creating training net from net file: train_val.prototxt
I1213 ::07.760124 net.cpp:] The NetState phase () differed from the phase () specified by a rule in layer data
I1213 ::07.760124 net.cpp:] The NetState phase () differed from the phase () specified by a rule in layer label
I1213 ::07.760124 net.cpp:] The NetState phase () differed from the phase () specified by a rule in layer accuracy
I1213 ::07.761126 net.cpp:] Initializing net from parameters:
state {
phase: TRAIN
level:
stage: ""
}
layer {
name: "data"
type: "Data"
top: "data"
include {
phase: TRAIN
}
transform_param {
scale: 0.00390625
mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_train_mean.binaryproto"
}
data_param {
source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_train"
batch_size:
backend: LMDB
}
}
layer {
name: "label"
type: "Data"
top: "label"
include {
phase: TRAIN
}
transform_param {
scale: 0.00390625
mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_train_mean.binaryproto"
}
data_param {
source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_train"
batch_size:
backend: LMDB
}
}
layer {
name: "conv1_1"
type: "Convolution"
bottom: "data"
top: "conv1_1"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu1_1"
type: "ReLU"
bottom: "conv1_1"
top: "conv1_1"
}
layer {
name: "conv1_2"
type: "Convolution"
bottom: "conv1_1"
top: "conv1_2"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu1_2"
type: "ReLU"
bottom: "conv1_2"
top: "conv1_2"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1_2"
top: "pool1"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv2_1"
type: "Convolution"
bottom: "pool1"
top: "conv2_1"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu2_1"
type: "ReLU"
bottom: "conv2_1"
top: "conv2_1"
}
layer {
name: "conv2_2"
type: "Convolution"
bottom: "conv2_1"
top: "conv2_2"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu2_2"
type: "ReLU"
bottom: "conv2_2"
top: "conv2_2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2_2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv3_1"
type: "Convolution"
bottom: "pool2"
top: "conv3_1"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu3_1"
type: "ReLU"
bottom: "conv3_1"
top: "conv3_1"
}
layer {
name: "conv3_2"
type: "Convolution"
bottom: "conv3_1"
top: "conv3_2"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu3_2"
type: "ReLU"
bottom: "conv3_2"
top: "conv3_2"
}
layer {
name: "conv3_3"
type: "Convolution"
bottom: "conv3_2"
top: "conv3_3"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu3_3"
type: "ReLU"
bottom: "conv3_3"
top: "conv3_3"
}
layer {
name: "pool3"
type: "Pooling"
bottom: "conv3_3"
top: "pool3"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv4_1"
type: "Convolution"
bottom: "pool3"
top: "conv4_1"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu4_1"
type: "ReLU"
bottom: "conv4_1"
top: "conv4_1"
}
layer {
name: "conv4_2"
type: "Convolution"
bottom: "conv4_1"
top: "conv4_2"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu4_2"
type: "ReLU"
bottom: "conv4_2"
top: "conv4_2"
}
layer {
name: "conv4_3"
type: "Convolution"
bottom: "conv4_2"
top: "conv4_3"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu4_3"
type: "ReLU"
bottom: "conv4_3"
top: "conv4_3"
}
layer {
name: "pool4"
type: "Pooling"
bottom: "conv4_3"
top: "pool4"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv5_1"
type: "Convolution"
bottom: "pool4"
top: "conv5_1"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu5_1"
type: "ReLU"
bottom: "conv5_1"
top: "conv5_1"
}
layer {
name: "conv5_2"
type: "Convolution"
bottom: "conv5_1"
top: "conv5_2"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu5_2"
type: "ReLU"
bottom: "conv5_2"
top: "conv5_2"
}
layer {
name: "conv5_3"
type: "Convolution"
bottom: "conv5_2"
top: "conv5_3"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu5_3"
type: "ReLU"
bottom: "conv5_3"
top: "conv5_3"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5_3"
top: "pool5"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "fc6"
type: "Convolution"
bottom: "pool5"
top: "fc6"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "Convolution"
bottom: "fc6"
top: "fc7"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "fc7"
top: "fc7"
}
layer {
name: "drop7"
type: "Dropout"
bottom: "fc7"
top: "fc7"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "score_fr"
type: "Convolution"
bottom: "fc7"
top: "score_fr"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
}
}
layer {
name: "upscore2"
type: "Deconvolution"
bottom: "score_fr"
top: "upscore2"
param {
lr_mult:
}
convolution_param {
num_output:
bias_term: false
kernel_size:
stride:
}
}
layer {
name: "score_pool4"
type: "Convolution"
bottom: "pool4"
top: "score_pool4"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
}
}
layer {
name: "score_pool4c"
type: "Crop"
bottom: "score_pool4"
bottom: "upscore2"
top: "score_pool4c"
crop_param {
axis:
offset:
}
}
layer {
name: "fuse_pool4"
type: "Eltwise"
bottom: "upscore2"
bottom: "score_pool4c"
top: "fuse_pool4"
eltwise_param {
operation: SUM
}
}
layer {
name: "upscore_pool4"
type: "Deconvolution"
bottom: "fuse_pool4"
top: "upscore_pool4"
param {
lr_mult:
}
convolution_param {
num_output:
bias_term: false
kernel_size:
stride:
}
}
layer {
name: "score_pool3"
type: "Convolution"
bottom: "pool3"
top: "score_pool3"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
}
}
layer {
name: "score_pool3c"
type: "Crop"
bottom: "score_pool3"
bottom: "upscore_pool4"
top: "score_pool3c"
crop_param {
axis:
offset:
}
}
layer {
name: "fuse_pool3"
type: "Eltwise"
bottom: "upscore_pool4"
bottom: "score_pool3c"
top: "fuse_pool3"
eltwise_param {
operation: SUM
}
}
layer {
name: "upscore8"
type: "Deconvolution"
bottom: "fuse_pool3"
top: "upscore8"
param {
lr_mult:
}
convolution_param {
num_output:
bias_term: false
kernel_size:
stride:
}
}
layer {
name: "score"
type: "Crop"
bottom: "upscore8"
bottom: "data"
top: "score"
crop_param {
axis:
offset:
}
}
layer {
name: "loss"
type: "SoftmaxWithLoss"
bottom: "score"
bottom: "label"
top: "loss"
loss_param {
ignore_label:
normalize: false
}
}
I1213 ::07.787643 layer_factory.hpp:] Creating layer data
I1213 ::07.788645 common.cpp:] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 ::07.789145 net.cpp:] Creating Layer data
I1213 ::07.789645 net.cpp:] data -> data
I1213 ::07.790145 common.cpp:] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 ::07.790145 data_transformer.cpp:] Loading mean file from: G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_train_mean.binaryproto
I1213 ::07.791647 db_lmdb.cpp:] Opened lmdb G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_train
I1213 ::07.841182 data_layer.cpp:] output data size: ,,,
I1213 ::07.846186 net.cpp:] Setting up data
I1213 ::07.846688 net.cpp:] Top shape: ()
I1213 ::07.849189 common.cpp:] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 ::07.849689 net.cpp:] Memory required for data:
I1213 ::07.852190 layer_factory.hpp:] Creating layer data_data_0_split
I1213 ::07.853691 net.cpp:] Creating Layer data_data_0_split
I1213 ::07.855195 net.cpp:] data_data_0_split <- data
I1213 ::07.856194 net.cpp:] data_data_0_split -> data_data_0_split_0
I1213 ::07.857697 net.cpp:] data_data_0_split -> data_data_0_split_1
I1213 ::07.858695 net.cpp:] Setting up data_data_0_split
I1213 ::07.859695 net.cpp:] Top shape: ()
I1213 ::07.862702 net.cpp:] Top shape: ()
I1213 ::07.864199 net.cpp:] Memory required for data:
I1213 ::07.865211 layer_factory.hpp:] Creating layer label
I1213 ::07.866701 net.cpp:] Creating Layer label
I1213 ::07.867712 net.cpp:] label -> label
I1213 ::07.869706 common.cpp:] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 ::07.870203 data_transformer.cpp:] Loading mean file from: G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_train_mean.binaryproto
I1213 ::07.873206 db_lmdb.cpp:] Opened lmdb G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_train
I1213 ::07.875710 data_layer.cpp:] output data size: ,,,
I1213 ::07.877709 net.cpp:] Setting up label
I1213 ::07.879212 net.cpp:] Top shape: ()
I1213 ::07.881211 common.cpp:] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 ::07.882211 net.cpp:] Memory required for data:
[... analogous "Creating layer / Setting up / Top shape / Memory required" messages for conv1_1 through the final score crop omitted; every layer initializes in the same pattern as data and label above ...]
I1213 ::08.674276 layer_factory.hpp:] Creating layer loss
I1213 ::08.674276 net.cpp:] Creating Layer loss
I1213 ::08.675277 net.cpp:] loss <- score
I1213 ::08.675277 net.cpp:] loss <- label
I1213 ::08.675277 net.cpp:] loss -> loss
I1213 ::08.675277 layer_factory.hpp:] Creating layer loss
I1213 ::08.678781 net.cpp:] Setting up loss
I1213 ::08.679280 net.cpp:] Top shape: ()
I1213 ::08.679780 net.cpp:] with loss weight
I1213 ::08.680280 net.cpp:] Memory required for data:
I1213 ::08.680280 net.cpp:] loss needs backward computation.
I1213 ::08.680280 net.cpp:] score needs backward computation.
I1213 ::08.680280 net.cpp:] upscore8 needs backward computation.
I1213 ::08.680280 net.cpp:] fuse_pool3 needs backward computation.
I1213 ::08.680280 net.cpp:] score_pool3c needs backward computation.
I1213 ::08.680280 net.cpp:] score_pool3 needs backward computation.
I1213 ::08.682281 net.cpp:] upscore_pool4_upscore_pool4_0_split needs backward computation.
I1213 ::08.682782 net.cpp:] upscore_pool4 needs backward computation.
I1213 ::08.682782 net.cpp:] fuse_pool4 needs backward computation.
I1213 ::08.682782 net.cpp:] score_pool4c needs backward computation.
I1213 ::08.683282 net.cpp:] score_pool4 needs backward computation.
I1213 ::08.683282 net.cpp:] upscore2_upscore2_0_split needs backward computation.
I1213 ::08.683282 net.cpp:] upscore2 needs backward computation.
I1213 ::08.683282 net.cpp:] score_fr needs backward computation.
I1213 ::08.683282 net.cpp:] drop7 needs backward computation.
I1213 ::08.683784 net.cpp:] relu7 needs backward computation.
I1213 ::08.683784 net.cpp:] fc7 needs backward computation.
I1213 ::08.683784 net.cpp:] drop6 needs backward computation.
I1213 ::08.683784 net.cpp:] relu6 needs backward computation.
I1213 ::08.684284 net.cpp:] fc6 needs backward computation.
I1213 ::08.684284 net.cpp:] pool5 needs backward computation.
I1213 ::08.684284 net.cpp:] relu5_3 needs backward computation.
I1213 ::08.684783 net.cpp:] conv5_3 needs backward computation.
I1213 ::08.685284 net.cpp:] relu5_2 needs backward computation.
I1213 ::08.685784 net.cpp:] conv5_2 needs backward computation.
I1213 ::08.686285 net.cpp:] relu5_1 needs backward computation.
I1213 ::08.686285 net.cpp:] conv5_1 needs backward computation.
I1213 ::08.686285 net.cpp:] pool4_pool4_0_split needs backward computation.
I1213 ::08.686285 net.cpp:] pool4 needs backward computation.
I1213 ::08.687286 net.cpp:] relu4_3 needs backward computation.
I1213 ::08.687286 net.cpp:] conv4_3 needs backward computation.
I1213 ::08.687286 net.cpp:] relu4_2 needs backward computation.
I1213 ::08.687286 net.cpp:] conv4_2 needs backward computation.
I1213 ::08.687286 net.cpp:] relu4_1 needs backward computation.
I1213 ::08.688787 net.cpp:] conv4_1 needs backward computation.
I1213 ::08.688787 net.cpp:] pool3_pool3_0_split needs backward computation.
I1213 ::08.688787 net.cpp:] pool3 needs backward computation.
I1213 ::08.689286 net.cpp:] relu3_3 needs backward computation.
I1213 ::08.690287 net.cpp:] conv3_3 needs backward computation.
I1213 ::08.690287 net.cpp:] relu3_2 needs backward computation.
I1213 ::08.690287 net.cpp:] conv3_2 needs backward computation.
I1213 ::08.690287 net.cpp:] relu3_1 needs backward computation.
I1213 ::08.691288 net.cpp:] conv3_1 needs backward computation.
I1213 ::08.691288 net.cpp:] pool2 needs backward computation.
I1213 ::08.691288 net.cpp:] relu2_2 needs backward computation.
I1213 ::08.691288 net.cpp:] conv2_2 needs backward computation.
I1213 ::08.691788 net.cpp:] relu2_1 needs backward computation.
I1213 ::08.692291 net.cpp:] conv2_1 needs backward computation.
I1213 ::08.692790 net.cpp:] pool1 needs backward computation.
I1213 ::08.692790 net.cpp:] relu1_2 needs backward computation.
I1213 ::08.692790 net.cpp:] conv1_2 needs backward computation.
I1213 ::08.693289 net.cpp:] relu1_1 needs backward computation.
I1213 ::08.693289 net.cpp:] conv1_1 needs backward computation.
I1213 ::08.693289 net.cpp:] label does not need backward computation.
I1213 ::08.693789 net.cpp:] data_data_0_split does not need backward computation.
I1213 ::08.694290 net.cpp:] data does not need backward computation.
I1213 ::08.694290 net.cpp:] This network produces output loss
I1213 ::08.694290 net.cpp:] Network initialization done.
I1213 ::08.695791 solver.cpp:] Creating test net (#) specified by net file: train_val.prototxt
I1213 ::08.695791 net.cpp:] The NetState phase () differed from the phase () specified by a rule in layer data
I1213 ::08.698796 net.cpp:] The NetState phase () differed from the phase () specified by a rule in layer label
I1213 ::08.699795 net.cpp:] Initializing net from parameters:
state {
phase: TEST
}
layer {
name: "data"
type: "Data"
top: "data"
include {
phase: TEST
}
transform_param {
scale: 0.00390625
mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_val_mean.binaryproto"
}
data_param {
source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_val"
batch_size:
backend: LMDB
}
}
layer {
name: "label"
type: "Data"
top: "label"
include {
phase: TEST
}
transform_param {
scale: 0.00390625
mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_val_mean.binaryproto"
}
data_param {
source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_val"
batch_size:
backend: LMDB
}
}
layer {
name: "conv1_1"
type: "Convolution"
bottom: "data"
top: "conv1_1"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu1_1"
type: "ReLU"
bottom: "conv1_1"
top: "conv1_1"
}
layer {
name: "conv1_2"
type: "Convolution"
bottom: "conv1_1"
top: "conv1_2"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu1_2"
type: "ReLU"
bottom: "conv1_2"
top: "conv1_2"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1_2"
top: "pool1"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv2_1"
type: "Convolution"
bottom: "pool1"
top: "conv2_1"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu2_1"
type: "ReLU"
bottom: "conv2_1"
top: "conv2_1"
}
layer {
name: "conv2_2"
type: "Convolution"
bottom: "conv2_1"
top: "conv2_2"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu2_2"
type: "ReLU"
bottom: "conv2_2"
top: "conv2_2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2_2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv3_1"
type: "Convolution"
bottom: "pool2"
top: "conv3_1"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu3_1"
type: "ReLU"
bottom: "conv3_1"
top: "conv3_1"
}
layer {
name: "conv3_2"
type: "Convolution"
bottom: "conv3_1"
top: "conv3_2"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu3_2"
type: "ReLU"
bottom: "conv3_2"
top: "conv3_2"
}
layer {
name: "conv3_3"
type: "Convolution"
bottom: "conv3_2"
top: "conv3_3"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu3_3"
type: "ReLU"
bottom: "conv3_3"
top: "conv3_3"
}
layer {
name: "pool3"
type: "Pooling"
bottom: "conv3_3"
top: "pool3"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv4_1"
type: "Convolution"
bottom: "pool3"
top: "conv4_1"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu4_1"
type: "ReLU"
bottom: "conv4_1"
top: "conv4_1"
}
layer {
name: "conv4_2"
type: "Convolution"
bottom: "conv4_1"
top: "conv4_2"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu4_2"
type: "ReLU"
bottom: "conv4_2"
top: "conv4_2"
}
layer {
name: "conv4_3"
type: "Convolution"
bottom: "conv4_2"
top: "conv4_3"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu4_3"
type: "ReLU"
bottom: "conv4_3"
top: "conv4_3"
}
layer {
name: "pool4"
type: "Pooling"
bottom: "conv4_3"
top: "pool4"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv5_1"
type: "Convolution"
bottom: "pool4"
top: "conv5_1"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu5_1"
type: "ReLU"
bottom: "conv5_1"
top: "conv5_1"
}
layer {
name: "conv5_2"
type: "Convolution"
bottom: "conv5_1"
top: "conv5_2"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu5_2"
type: "ReLU"
bottom: "conv5_2"
top: "conv5_2"
}
layer {
name: "conv5_3"
type: "Convolution"
bottom: "conv5_2"
top: "conv5_3"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu5_3"
type: "ReLU"
bottom: "conv5_3"
top: "conv5_3"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5_3"
top: "pool5"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "fc6"
type: "Convolution"
bottom: "pool5"
top: "fc6"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "Convolution"
bottom: "fc6"
top: "fc7"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
stride:
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "fc7"
top: "fc7"
}
layer {
name: "drop7"
type: "Dropout"
bottom: "fc7"
top: "fc7"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "score_fr"
type: "Convolution"
bottom: "fc7"
top: "score_fr"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
}
}
layer {
name: "upscore2"
type: "Deconvolution"
bottom: "score_fr"
top: "upscore2"
param {
lr_mult:
}
convolution_param {
num_output:
bias_term: false
kernel_size:
stride:
}
}
layer {
name: "score_pool4"
type: "Convolution"
bottom: "pool4"
top: "score_pool4"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
}
}
layer {
name: "score_pool4c"
type: "Crop"
bottom: "score_pool4"
bottom: "upscore2"
top: "score_pool4c"
crop_param {
axis:
offset:
}
}
layer {
name: "fuse_pool4"
type: "Eltwise"
bottom: "upscore2"
bottom: "score_pool4c"
top: "fuse_pool4"
eltwise_param {
operation: SUM
}
}
layer {
name: "upscore_pool4"
type: "Deconvolution"
bottom: "fuse_pool4"
top: "upscore_pool4"
param {
lr_mult:
}
convolution_param {
num_output:
bias_term: false
kernel_size:
stride:
}
}
layer {
name: "score_pool3"
type: "Convolution"
bottom: "pool3"
top: "score_pool3"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
}
}
layer {
name: "score_pool3c"
type: "Crop"
bottom: "score_pool3"
bottom: "upscore_pool4"
top: "score_pool3c"
crop_param {
axis:
offset:
}
}
layer {
name: "fuse_pool3"
type: "Eltwise"
bottom: "upscore_pool4"
bottom: "score_pool3c"
top: "fuse_pool3"
eltwise_param {
operation: SUM
}
}
layer {
name: "upscore8"
type: "Deconvolution"
bottom: "fuse_pool3"
top: "upscore8"
param {
lr_mult:
}
convolution_param {
num_output:
bias_term: false
kernel_size:
stride:
}
}
layer {
name: "score"
type: "Crop"
bottom: "upscore8"
bottom: "data"
top: "score"
crop_param {
axis:
offset:
}
}
layer {
name: "accuracy"
type: "Accuracy"
bottom: "score"
bottom: "label"
top: "accuracy"
include {
phase: TEST
}
}
layer {
name: "loss"
type: "SoftmaxWithLoss"
bottom: "score"
bottom: "label"
top: "loss"
loss_param {
ignore_label:
normalize: false
}
}
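One detail worth noting about the prototxt above: the three `Deconvolution` layers (`upscore2`, `upscore_pool4`, `upscore8`) are not learned from random initialization in the reference FCN code — they are filled with a fixed bilinear interpolation kernel (the `surgery.interp` step), and `lr_mult` is typically set so they stay frozen. The exact kernel sizes and strides are elided in this dump; below is a minimal NumPy sketch of that bilinear kernel (the function name `bilinear_kernel` is my own):

```python
import numpy as np

def bilinear_kernel(size):
    """2D bilinear interpolation kernel of shape (size, size),
    as used to initialize FCN upsampling (Deconvolution) layers."""
    factor = (size + 1) // 2
    # Kernel center differs for odd vs. even sizes
    center = factor - 1 if size % 2 == 1 else factor - 0.5
    og = np.ogrid[:size, :size]
    # Separable triangular (bilinear) weights in each dimension
    return ((1 - abs(og[0] - center) / factor) *
            (1 - abs(og[1] - center) / factor))

k = bilinear_kernel(4)   # 4x4 kernel -> 2x upsampling with stride 2
```

In the reference voc-fcn8s model the 2x upsampling layers use a 4x4 kernel with stride 2 and the final `upscore8` uses a 16x16 kernel with stride 8; since deconvolution output is slightly larger than the target, the `Crop` layers (`score_pool4c`, `score_pool3c`, `score`) trim the maps back to a common size before the element-wise `SUM` fusion.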
I1213 ::08.702296 layer_factory.hpp:] Creating layer data
I1213 ::08.703297 net.cpp:] Creating Layer data
I1213 ::08.704798 net.cpp:] data -> data
I1213 ::08.705298 common.cpp:] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 ::08.706300 data_transformer.cpp:] Loading mean file from: G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_val_mean.binaryproto
I1213 ::08.707300 db_lmdb.cpp:] Opened lmdb G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_val
I1213 ::08.709803 data_layer.cpp:] output data size: ,,,
I1213 ::08.715806 net.cpp:] Setting up data
I1213 ::08.716306 net.cpp:] Top shape: ()
I1213 ::08.716807 net.cpp:] Memory required for data:
I1213 ::08.716807 layer_factory.hpp:] Creating layer data_data_0_split
I1213 ::08.717808 common.cpp:] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 ::08.718808 net.cpp:] Creating Layer data_data_0_split
I1213 ::08.720309 net.cpp:] data_data_0_split <- data
I1213 ::08.722811 net.cpp:] data_data_0_split -> data_data_0_split_0
I1213 ::08.723311 net.cpp:] data_data_0_split -> data_data_0_split_1
I1213 ::08.723311 net.cpp:] Setting up data_data_0_split
I1213 ::08.723811 net.cpp:] Top shape: ()
I1213 ::08.724812 net.cpp:] Top shape: ()
I1213 ::08.724812 net.cpp:] Memory required for data:
I1213 ::08.724812 layer_factory.hpp:] Creating layer label
I1213 ::08.725312 net.cpp:] Creating Layer label
I1213 ::08.725813 net.cpp:] label -> label
I1213 ::08.727314 data_transformer.cpp:] Loading mean file from: G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_val_mean.binaryproto
I1213 ::08.727814 common.cpp:] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 ::08.730816 db_lmdb.cpp:] Opened lmdb G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_val
I1213 ::08.731317 data_layer.cpp:] output data size: ,,,
I1213 ::08.733319 net.cpp:] Setting up label
I1213 ::08.733319 net.cpp:] Top shape: ()
I1213 ::08.734318 common.cpp:] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 ::08.734819 net.cpp:] Memory required for data:
I1213 ::08.736820 layer_factory.hpp:] Creating layer label_label_0_split
I1213 ::08.737321 net.cpp:] Creating Layer label_label_0_split
I1213 ::08.738822 net.cpp:] label_label_0_split <- label
I1213 ::08.739823 net.cpp:] label_label_0_split -> label_label_0_split_0
I1213 ::08.739823 net.cpp:] label_label_0_split -> label_label_0_split_1
I1213 ::08.740324 net.cpp:] Setting up label_label_0_split
I1213 ::08.740823 net.cpp:] Top shape: ()
I1213 ::08.741324 net.cpp:] Top shape: ()
I1213 ::08.742324 net.cpp:] Memory required for data:
I1213 ::08.743825 layer_factory.hpp:] Creating layer conv1_1
I1213 ::08.744827 net.cpp:] Creating Layer conv1_1
I1213 ::08.745326 net.cpp:] conv1_1 <- data_data_0_split_0
I1213 ::08.746327 net.cpp:] conv1_1 -> conv1_1
I1213 ::08.749830 net.cpp:] Setting up conv1_1
I1213 ::08.749830 net.cpp:] Top shape: ()
I1213 ::08.750830 net.cpp:] Memory required for data:
I1213 ::08.751332 layer_factory.hpp:] Creating layer relu1_1
I1213 ::08.751832 net.cpp:] Creating Layer relu1_1
I1213 ::08.752832 net.cpp:] relu1_1 <- conv1_1
I1213 ::08.753332 net.cpp:] relu1_1 -> conv1_1 (in-place)
I1213 ::08.756836 net.cpp:] Setting up relu1_1
I1213 ::08.757336 net.cpp:] Top shape: ()
I1213 ::08.757835 net.cpp:] Memory required for data:
I1213 ::08.760339 layer_factory.hpp:] Creating layer conv1_2
I1213 ::08.761338 net.cpp:] Creating Layer conv1_2
I1213 ::08.761838 net.cpp:] conv1_2 <- conv1_1
I1213 ::08.762339 net.cpp:] conv1_2 -> conv1_2
I1213 ::08.767343 net.cpp:] Setting up conv1_2
I1213 ::08.767843 net.cpp:] Top shape: ()
I1213 ::08.768343 net.cpp:] Memory required for data:
I1213 ::08.769345 layer_factory.hpp:] Creating layer relu1_2
I1213 ::08.769845 net.cpp:] Creating Layer relu1_2
I1213 ::08.771845 net.cpp:] relu1_2 <- conv1_2
I1213 ::08.772346 net.cpp:] relu1_2 -> conv1_2 (in-place)
I1213 ::08.775348 net.cpp:] Setting up relu1_2
I1213 ::08.775849 net.cpp:] Top shape: ()
I1213 ::08.776350 net.cpp:] Memory required for data:
I1213 ::08.777349 layer_factory.hpp:] Creating layer pool1
I1213 ::08.778350 net.cpp:] Creating Layer pool1
I1213 ::08.778851 net.cpp:] pool1 <- conv1_2
I1213 ::08.779851 net.cpp:] pool1 -> pool1
I1213 ::08.780853 net.cpp:] Setting up pool1
I1213 ::08.781352 net.cpp:] Top shape: ()
I1213 ::08.782353 net.cpp:] Memory required for data:
I1213 ::08.782853 layer_factory.hpp:] Creating layer conv2_1
I1213 ::08.783854 net.cpp:] Creating Layer conv2_1
I1213 ::08.784854 net.cpp:] conv2_1 <- pool1
I1213 ::08.785356 net.cpp:] conv2_1 -> conv2_1
I1213 ::08.791860 net.cpp:] Setting up conv2_1
I1213 ::08.791860 net.cpp:] Top shape: ()
I1213 ::08.792861 net.cpp:] Memory required for data:
I1213 ::08.793861 layer_factory.hpp:] Creating layer relu2_1
I1213 ::08.794363 net.cpp:] Creating Layer relu2_1
I1213 ::08.794862 net.cpp:] relu2_1 <- conv2_1
I1213 ::08.795362 net.cpp:] relu2_1 -> conv2_1 (in-place)
I1213 ::08.796363 net.cpp:] Setting up relu2_1
I1213 ::08.796363 net.cpp:] Top shape: ()
I1213 ::08.796864 net.cpp:] Memory required for data:
I1213 ::08.797363 layer_factory.hpp:] Creating layer conv2_2
I1213 ::08.797864 net.cpp:] Creating Layer conv2_2
I1213 ::08.798364 net.cpp:] conv2_2 <- conv2_1
I1213 ::08.798864 net.cpp:] conv2_2 -> conv2_2
I1213 ::08.802367 net.cpp:] Setting up conv2_2
I1213 ::08.802367 net.cpp:] Top shape: ()
I1213 ::08.803367 net.cpp:] Memory required for data:
I1213 ::08.803869 layer_factory.hpp:] Creating layer relu2_2
I1213 ::08.804869 net.cpp:] Creating Layer relu2_2
I1213 ::08.807371 net.cpp:] relu2_2 <- conv2_2
I1213 ::08.808372 net.cpp:] relu2_2 -> conv2_2 (in-place)
I1213 ::08.809875 net.cpp:] Setting up relu2_2
I1213 ::08.810374 net.cpp:] Top shape: ()
I1213 ::08.810874 net.cpp:] Memory required for data:
I1213 ::08.811373 layer_factory.hpp:] Creating layer pool2
I1213 ::08.811874 net.cpp:] Creating Layer pool2
I1213 ::08.812374 net.cpp:] pool2 <- conv2_2
I1213 ::08.812875 net.cpp:] pool2 -> pool2
I1213 ::08.813375 net.cpp:] Setting up pool2
I1213 ::08.813875 net.cpp:] Top shape: ()
I1213 ::08.814376 net.cpp:] Memory required for data:
I1213 ::08.814877 layer_factory.hpp:] Creating layer conv3_1
I1213 ::08.815376 net.cpp:] Creating Layer conv3_1
I1213 ::08.815877 net.cpp:] conv3_1 <- pool2
I1213 ::08.816377 net.cpp:] conv3_1 -> conv3_1
I1213 ::08.819380 net.cpp:] Setting up conv3_1
I1213 ::08.819380 net.cpp:] Top shape: ()
I1213 ::08.819880 net.cpp:] Memory required for data:
I1213 ::08.822382 layer_factory.hpp:] Creating layer relu3_1
I1213 ::08.823384 net.cpp:] Creating Layer relu3_1
I1213 ::08.823884 net.cpp:] relu3_1 <- conv3_1
I1213 ::08.824383 net.cpp:] relu3_1 -> conv3_1 (in-place)
I1213 ::08.826386 net.cpp:] Setting up relu3_1
I1213 ::08.826886 net.cpp:] Top shape: ()
I1213 ::08.827386 net.cpp:] Memory required for data:
I1213 ::08.828387 layer_factory.hpp:] Creating layer conv3_2
I1213 ::08.828887 net.cpp:] Creating Layer conv3_2
I1213 ::08.829887 net.cpp:] conv3_2 <- conv3_1
I1213 ::08.830387 net.cpp:] conv3_2 -> conv3_2
I1213 ::08.838393 net.cpp:] Setting up conv3_2
I1213 ::08.838393 net.cpp:] Top shape: ()
I1213 ::08.838893 net.cpp:] Memory required for data:
I1213 ::08.840395 layer_factory.hpp:] Creating layer relu3_2
I1213 ::08.840894 net.cpp:] Creating Layer relu3_2
I1213 ::08.841395 net.cpp:] relu3_2 <- conv3_2
I1213 ::08.841895 net.cpp:] relu3_2 -> conv3_2 (in-place)
I1213 ::08.842896 net.cpp:] Setting up relu3_2
I1213 ::08.842896 net.cpp:] Top shape: ()
I1213 ::08.843397 net.cpp:] Memory required for data:
I1213 ::08.844398 layer_factory.hpp:] Creating layer conv3_3
I1213 ::08.844898 net.cpp:] Creating Layer conv3_3
I1213 ::08.845397 net.cpp:] conv3_3 <- conv3_2
I1213 ::08.845899 net.cpp:] conv3_3 -> conv3_3
I1213 ::08.850401 net.cpp:] Setting up conv3_3
I1213 ::08.850401 net.cpp:] Top shape: ()
I1213 ::08.851402 net.cpp:] Memory required for data:
I1213 ::08.851903 layer_factory.hpp:] Creating layer relu3_3
I1213 ::08.852403 net.cpp:] Creating Layer relu3_3
I1213 ::08.852903 net.cpp:] relu3_3 <- conv3_3
I1213 ::08.853404 net.cpp:] relu3_3 -> conv3_3 (in-place)
I1213 ::08.854964 net.cpp:] Setting up relu3_3
I1213 ::08.855406 net.cpp:] Top shape: ()
I1213 ::08.855906 net.cpp:] Memory required for data:
I1213 ::08.856405 layer_factory.hpp:] Creating layer pool3
I1213 ::08.856906 net.cpp:] Creating Layer pool3
I1213 ::08.857406 net.cpp:] pool3 <- conv3_3
I1213 ::08.857906 net.cpp:] pool3 -> pool3
I1213 ::08.858907 net.cpp:] Setting up pool3
I1213 ::08.858907 net.cpp:] Top shape: ()
I1213 ::08.859908 net.cpp:] Memory required for data:
I1213 ::08.860409 layer_factory.hpp:] Creating layer pool3_pool3_0_split
I1213 ::08.860909 net.cpp:] Creating Layer pool3_pool3_0_split
I1213 ::08.860909 net.cpp:] pool3_pool3_0_split <- pool3
I1213 ::08.861409 net.cpp:] pool3_pool3_0_split -> pool3_pool3_0_split_0
I1213 ::08.861910 net.cpp:] pool3_pool3_0_split -> pool3_pool3_0_split_1
I1213 ::08.862910 net.cpp:] Setting up pool3_pool3_0_split
I1213 ::08.862910 net.cpp:] Top shape: ()
I1213 ::08.863410 net.cpp:] Top shape: ()
I1213 ::08.863910 net.cpp:] Memory required for data:
I1213 ::08.864411 layer_factory.hpp:] Creating layer conv4_1
I1213 ::08.864912 net.cpp:] Creating Layer conv4_1
I1213 ::08.865412 net.cpp:] conv4_1 <- pool3_pool3_0_split_0
I1213 ::08.865913 net.cpp:] conv4_1 -> conv4_1
I1213 ::08.871917 net.cpp:] Setting up conv4_1
I1213 ::08.871917 net.cpp:] Top shape: ()
I1213 ::08.872918 net.cpp:] Memory required for data:
I1213 ::08.873417 layer_factory.hpp:] Creating layer relu4_1
I1213 ::08.874919 net.cpp:] Creating Layer relu4_1
I1213 ::08.875419 net.cpp:] relu4_1 <- conv4_1
I1213 ::08.875921 net.cpp:] relu4_1 -> conv4_1 (in-place)
I1213 ::08.877421 net.cpp:] Setting up relu4_1
I1213 ::08.877421 net.cpp:] Top shape: ()
I1213 ::08.877921 net.cpp:] Memory required for data:
I1213 ::08.878422 layer_factory.hpp:] Creating layer conv4_2
I1213 ::08.878922 net.cpp:] Creating Layer conv4_2
I1213 ::08.879422 net.cpp:] conv4_2 <- conv4_1
I1213 ::08.879923 net.cpp:] conv4_2 -> conv4_2
I1213 ::08.885927 net.cpp:] Setting up conv4_2
I1213 ::08.885927 net.cpp:] Top shape: ()
I1213 ::08.886929 net.cpp:] Memory required for data:
I1213 ::08.886929 layer_factory.hpp:] Creating layer relu4_2
I1213 ::08.886929 net.cpp:] Creating Layer relu4_2
I1213 ::08.886929 net.cpp:] relu4_2 <- conv4_2
I1213 ::08.886929 net.cpp:] relu4_2 -> conv4_2 (in-place)
I1213 ::08.887929 net.cpp:] Setting up relu4_2
I1213 ::08.888429 net.cpp:] Top shape: ()
I1213 ::08.888929 net.cpp:] Memory required for data:
I1213 ::08.890933 layer_factory.hpp:] Creating layer conv4_3
I1213 ::08.891433 net.cpp:] Creating Layer conv4_3
I1213 ::08.891433 net.cpp:] conv4_3 <- conv4_2
I1213 ::08.891433 net.cpp:] conv4_3 -> conv4_3
I1213 ::08.897935 net.cpp:] Setting up conv4_3
I1213 ::08.897935 net.cpp:] Top shape: ()
I1213 ::08.898437 net.cpp:] Memory required for data:
I1213 ::08.898936 layer_factory.hpp:] Creating layer relu4_3
I1213 ::08.899437 net.cpp:] Creating Layer relu4_3
I1213 ::08.899937 net.cpp:] relu4_3 <- conv4_3
I1213 ::08.900437 net.cpp:] relu4_3 -> conv4_3 (in-place)
I1213 ::08.901938 net.cpp:] Setting up relu4_3
I1213 ::08.902438 net.cpp:] Top shape: ()
I1213 ::08.902938 net.cpp:] Memory required for data:
I1213 ::08.903439 layer_factory.hpp:] Creating layer pool4
I1213 ::08.903939 net.cpp:] Creating Layer pool4
I1213 ::08.904940 net.cpp:] pool4 <- conv4_3
I1213 ::08.907443 net.cpp:] pool4 -> pool4
I1213 ::08.907443 net.cpp:] Setting up pool4
I1213 ::08.907943 net.cpp:] Top shape: ()
I1213 ::08.908443 net.cpp:] Memory required for data:
I1213 ::08.908443 layer_factory.hpp:] Creating layer pool4_pool4_0_split
I1213 ::08.908443 net.cpp:] Creating Layer pool4_pool4_0_split
I1213 ::08.908943 net.cpp:] pool4_pool4_0_split <- pool4
I1213 ::08.909443 net.cpp:] pool4_pool4_0_split -> pool4_pool4_0_split_0
I1213 ::08.909945 net.cpp:] pool4_pool4_0_split -> pool4_pool4_0_split_1
I1213 ::08.910444 net.cpp:] Setting up pool4_pool4_0_split
I1213 ::08.910944 net.cpp:] Top shape: ()
I1213 ::08.911445 net.cpp:] Top shape: ()
I1213 ::08.911445 net.cpp:] Memory required for data:
I1213 ::08.911445 layer_factory.hpp:] Creating layer conv5_1
I1213 ::08.911945 net.cpp:] Creating Layer conv5_1
I1213 ::08.912446 net.cpp:] conv5_1 <- pool4_pool4_0_split_0
I1213 ::08.912946 net.cpp:] conv5_1 -> conv5_1
I1213 ::08.919451 net.cpp:] Setting up conv5_1
I1213 ::08.919451 net.cpp:] Top shape: ()
I1213 ::08.919951 net.cpp:] Memory required for data:
I1213 ::08.922454 layer_factory.hpp:] Creating layer relu5_1
I1213 ::08.922953 net.cpp:] Creating Layer relu5_1
I1213 ::08.923954 net.cpp:] relu5_1 <- conv5_1
I1213 ::08.923954 net.cpp:] relu5_1 -> conv5_1 (in-place)
I1213 ::08.924454 net.cpp:] Setting up relu5_1
I1213 ::08.924955 net.cpp:] Top shape: ()
I1213 ::08.925456 net.cpp:] Memory required for data:
I1213 ::08.925956 layer_factory.hpp:] Creating layer conv5_2
I1213 ::08.926456 net.cpp:] Creating Layer conv5_2
I1213 ::08.926956 net.cpp:] conv5_2 <- conv5_1
I1213 ::08.927458 net.cpp:] conv5_2 -> conv5_2
I1213 ::08.933961 net.cpp:] Setting up conv5_2
I1213 ::08.933961 net.cpp:] Top shape: ()
I1213 ::08.934463 net.cpp:] Memory required for data:
I1213 ::08.934962 layer_factory.hpp:] Creating layer relu5_2
I1213 ::08.935462 net.cpp:] Creating Layer relu5_2
I1213 ::08.938464 net.cpp:] relu5_2 <- conv5_2
I1213 ::08.938464 net.cpp:] relu5_2 -> conv5_2 (in-place)
I1213 ::08.939966 net.cpp:] Setting up relu5_2
I1213 ::08.940466 net.cpp:] Top shape: ()
I1213 ::08.940966 net.cpp:] Memory required for data:
I1213 ::08.941467 layer_factory.hpp:] Creating layer conv5_3
I1213 ::08.942467 net.cpp:] Creating Layer conv5_3
I1213 ::08.942467 net.cpp:] conv5_3 <- conv5_2
I1213 ::08.942467 net.cpp:] conv5_3 -> conv5_3
I1213 ::08.948472 net.cpp:] Setting up conv5_3
I1213 ::08.948472 net.cpp:] Top shape: ()
I1213 ::08.948973 net.cpp:] Memory required for data:
I1213 ::08.949973 layer_factory.hpp:] Creating layer relu5_3
I1213 ::08.950474 net.cpp:] Creating Layer relu5_3
I1213 ::08.950973 net.cpp:] relu5_3 <- conv5_3
I1213 ::08.950973 net.cpp:] relu5_3 -> conv5_3 (in-place)
I1213 ::08.951975 net.cpp:] Setting up relu5_3
I1213 ::08.952976 net.cpp:] Top shape: ()
I1213 ::08.952976 net.cpp:] Memory required for data:
I1213 ::08.952976 layer_factory.hpp:] Creating layer pool5
I1213 ::08.952976 net.cpp:] Creating Layer pool5
I1213 ::08.952976 net.cpp:] pool5 <- conv5_3
I1213 ::08.953476 net.cpp:] pool5 -> pool5
I1213 ::08.953476 net.cpp:] Setting up pool5
I1213 ::08.954977 net.cpp:] Top shape: ()
I1213 ::08.955476 net.cpp:] Memory required for data:
I1213 ::08.955977 layer_factory.hpp:] Creating layer fc6
I1213 ::08.956979 net.cpp:] Creating Layer fc6
I1213 ::08.957479 net.cpp:] fc6 <- pool5
I1213 ::08.957979 net.cpp:] fc6 -> fc6
I1213 ::09.144121 net.cpp:] Setting up fc6
I1213 ::09.144121 net.cpp:] Top shape: ()
I1213 ::09.144611 net.cpp:] Memory required for data:
I1213 ::09.145612 layer_factory.hpp:] Creating layer relu6
I1213 ::09.146113 net.cpp:] Creating Layer relu6
I1213 ::09.146613 net.cpp:] relu6 <- fc6
I1213 ::09.147114 net.cpp:] relu6 -> fc6 (in-place)
I1213 ::09.148114 net.cpp:] Setting up relu6
I1213 ::09.148114 net.cpp:] Top shape: ()
I1213 ::09.148614 net.cpp:] Memory required for data:
I1213 ::09.149114 layer_factory.hpp:] Creating layer drop6
I1213 ::09.149616 net.cpp:] Creating Layer drop6
I1213 ::09.150615 net.cpp:] drop6 <- fc6
I1213 ::09.151116 net.cpp:] drop6 -> fc6 (in-place)
I1213 ::09.151617 net.cpp:] Setting up drop6
I1213 ::09.152117 net.cpp:] Top shape: ()
I1213 ::09.153617 net.cpp:] Memory required for data:
I1213 ::09.154119 layer_factory.hpp:] Creating layer fc7
I1213 ::09.154618 net.cpp:] Creating Layer fc7
I1213 ::09.155119 net.cpp:] fc7 <- fc6
I1213 ::09.155619 net.cpp:] fc7 -> fc7
I1213 ::09.190145 net.cpp:] Setting up fc7
I1213 ::09.190145 net.cpp:] Top shape: ()
I1213 ::09.191145 net.cpp:] Memory required for data:
I1213 ::09.191645 layer_factory.hpp:] Creating layer relu7
I1213 ::09.192145 net.cpp:] Creating Layer relu7
I1213 ::09.192646 net.cpp:] relu7 <- fc7
I1213 ::09.193145 net.cpp:] relu7 -> fc7 (in-place)
I1213 ::09.194146 net.cpp:] Setting up relu7
I1213 ::09.194146 net.cpp:] Top shape: ()
I1213 ::09.194648 net.cpp:] Memory required for data:
I1213 ::09.195147 layer_factory.hpp:] Creating layer drop7
I1213 ::09.195647 net.cpp:] Creating Layer drop7
I1213 ::09.196148 net.cpp:] drop7 <- fc7
I1213 ::09.196648 net.cpp:] drop7 -> fc7 (in-place)
I1213 ::09.197149 net.cpp:] Setting up drop7
I1213 ::09.197649 net.cpp:] Top shape: ()
I1213 ::09.198149 net.cpp:] Memory required for data:
I1213 ::09.198650 layer_factory.hpp:] Creating layer score_fr
I1213 ::09.200150 net.cpp:] Creating Layer score_fr
I1213 ::09.200651 net.cpp:] score_fr <- fc7
I1213 ::09.201653 net.cpp:] score_fr -> score_fr
I1213 ::09.203654 net.cpp:] Setting up score_fr
I1213 ::09.204154 net.cpp:] Top shape: ()
I1213 ::09.204654 net.cpp:] Memory required for data:
I1213 ::09.205155 layer_factory.hpp:] Creating layer upscore2
I1213 ::09.205656 net.cpp:] Creating Layer upscore2
I1213 ::09.206156 net.cpp:] upscore2 <- score_fr
I1213 ::09.207156 net.cpp:] upscore2 -> upscore2
I1213 ::09.207656 net.cpp:] Setting up upscore2
I1213 ::09.208156 net.cpp:] Top shape: ()
I1213 ::09.208657 net.cpp:] Memory required for data:
I1213 ::09.209157 layer_factory.hpp:] Creating layer upscore2_upscore2_0_split
I1213 ::09.209657 net.cpp:] Creating Layer upscore2_upscore2_0_split
I1213 ::09.210157 net.cpp:] upscore2_upscore2_0_split <- upscore2
I1213 ::09.210659 net.cpp:] upscore2_upscore2_0_split -> upscore2_upscore2_0_split_0
I1213 ::09.211158 net.cpp:] upscore2_upscore2_0_split -> upscore2_upscore2_0_split_1
I1213 ::09.211659 net.cpp:] Setting up upscore2_upscore2_0_split
I1213 ::09.212159 net.cpp:] Top shape: ()
I1213 ::09.212661 net.cpp:] Top shape: ()
I1213 ::09.213160 net.cpp:] Memory required for data:
I1213 ::09.213660 layer_factory.hpp:] Creating layer score_pool4
I1213 ::09.214160 net.cpp:] Creating Layer score_pool4
I1213 ::09.216163 net.cpp:] score_pool4 <- pool4_pool4_0_split_1
I1213 ::09.216663 net.cpp:] score_pool4 -> score_pool4
I1213 ::09.219164 net.cpp:] Setting up score_pool4
I1213 ::09.219666 net.cpp:] Top shape: ()
I1213 ::09.220165 net.cpp:] Memory required for data:
I1213 ::09.220665 layer_factory.hpp:] Creating layer score_pool4c
I1213 ::09.221166 net.cpp:] Creating Layer score_pool4c
I1213 ::09.221667 net.cpp:] score_pool4c <- score_pool4
I1213 ::09.222167 net.cpp:] score_pool4c <- upscore2_upscore2_0_split_0
I1213 ::09.222667 net.cpp:] score_pool4c -> score_pool4c
I1213 ::09.223167 net.cpp:] Setting up score_pool4c
I1213 ::09.223667 net.cpp:] Top shape: ()
I1213 ::09.224169 net.cpp:] Memory required for data:
I1213 ::09.224668 layer_factory.hpp:] Creating layer fuse_pool4
I1213 ::09.225168 net.cpp:] Creating Layer fuse_pool4
I1213 ::09.225668 net.cpp:] fuse_pool4 <- upscore2_upscore2_0_split_1
I1213 ::09.226169 net.cpp:] fuse_pool4 <- score_pool4c
I1213 ::09.226670 net.cpp:] fuse_pool4 -> fuse_pool4
I1213 ::09.227170 net.cpp:] Setting up fuse_pool4
I1213 ::09.227670 net.cpp:] Top shape: ()
I1213 ::09.228171 net.cpp:] Memory required for data:
I1213 ::09.228672 layer_factory.hpp:] Creating layer upscore_pool4
I1213 ::09.229171 net.cpp:] Creating Layer upscore_pool4
I1213 ::09.229672 net.cpp:] upscore_pool4 <- fuse_pool4
I1213 ::09.231673 net.cpp:] upscore_pool4 -> upscore_pool4
I1213 ::09.233175 net.cpp:] Setting up upscore_pool4
I1213 ::09.233175 net.cpp:] Top shape: ()
I1213 ::09.233675 net.cpp:] Memory required for data:
I1213 ::09.234175 layer_factory.hpp:] Creating layer upscore_pool4_upscore_pool4_0_split
I1213 ::09.234676 net.cpp:] Creating Layer upscore_pool4_upscore_pool4_0_split
I1213 ::09.235177 net.cpp:] upscore_pool4_upscore_pool4_0_split <- upscore_pool4
I1213 ::09.235677 net.cpp:] upscore_pool4_upscore_pool4_0_split -> upscore_pool4_upscore_pool4_0_split_0
I1213 ::09.236176 net.cpp:] upscore_pool4_upscore_pool4_0_split -> upscore_pool4_upscore_pool4_0_split_1
I1213 ::09.236677 net.cpp:] Setting up upscore_pool4_upscore_pool4_0_split
I1213 ::09.237177 net.cpp:] Top shape: ()
I1213 ::09.238178 net.cpp:] Top shape: ()
I1213 ::09.238679 net.cpp:] Memory required for data:
I1213 ::09.239179 layer_factory.hpp:] Creating layer score_pool3
I1213 ::09.239680 net.cpp:] Creating Layer score_pool3
I1213 ::09.240180 net.cpp:] score_pool3 <- pool3_pool3_0_split_1
I1213 ::09.240680 net.cpp:] score_pool3 -> score_pool3
I1213 ::09.243181 net.cpp:] Setting up score_pool3
I1213 ::09.243682 net.cpp:] Top shape: ()
I1213 ::09.244182 net.cpp:] Memory required for data:
I1213 ::09.244684 layer_factory.hpp:] Creating layer score_pool3c
I1213 ::09.245184 net.cpp:] Creating Layer score_pool3c
I1213 ::09.246685 net.cpp:] score_pool3c <- score_pool3
I1213 ::09.247186 net.cpp:] score_pool3c <- upscore_pool4_upscore_pool4_0_split_0
I1213 ::09.247687 net.cpp:] score_pool3c -> score_pool3c
I1213 ::09.248687 net.cpp:] Setting up score_pool3c
I1213 ::09.248687 net.cpp:] Top shape: ()
I1213 ::09.249187 net.cpp:] Memory required for data:
I1213 ::09.250187 layer_factory.hpp:] Creating layer fuse_pool3
I1213 ::09.250687 net.cpp:] Creating Layer fuse_pool3
I1213 ::09.251188 net.cpp:] fuse_pool3 <- upscore_pool4_upscore_pool4_0_split_1
I1213 ::09.251688 net.cpp:] fuse_pool3 <- score_pool3c
I1213 ::09.252189 net.cpp:] fuse_pool3 -> fuse_pool3
I1213 ::09.252689 net.cpp:] Setting up fuse_pool3
I1213 ::09.253190 net.cpp:] Top shape: ()
I1213 ::09.253690 net.cpp:] Memory required for data:
I1213 ::09.254191 layer_factory.hpp:] Creating layer upscore8
I1213 ::09.255190 net.cpp:] Creating Layer upscore8
I1213 ::09.255691 net.cpp:] upscore8 <- fuse_pool3
I1213 ::09.256192 net.cpp:] upscore8 -> upscore8
I1213 ::09.257701 net.cpp:] Setting up upscore8
I1213 ::09.258193 net.cpp:] Top shape: ()
I1213 ::09.258694 net.cpp:] Memory required for data:
I1213 ::09.259194 layer_factory.hpp:] Creating layer score
I1213 ::09.259694 net.cpp:] Creating Layer score
I1213 ::09.260195 net.cpp:] score <- upscore8
I1213 ::09.260695 net.cpp:] score <- data_data_0_split_1
I1213 ::09.262195 net.cpp:] score -> score
I1213 ::09.263198 net.cpp:] Setting up score
I1213 ::09.263696 net.cpp:] Top shape: ()
I1213 ::09.264197 net.cpp:] Memory required for data:
I1213 ::09.264698 layer_factory.hpp:] Creating layer score_score_0_split
I1213 ::09.265198 net.cpp:] Creating Layer score_score_0_split
I1213 ::09.265699 net.cpp:] score_score_0_split <- score
I1213 ::09.266199 net.cpp:] score_score_0_split -> score_score_0_split_0
I1213 ::09.266700 net.cpp:] score_score_0_split -> score_score_0_split_1
I1213 ::09.267200 net.cpp:] Setting up score_score_0_split
I1213 ::09.267700 net.cpp:] Top shape: ()
I1213 ::09.268200 net.cpp:] Top shape: ()
I1213 ::09.268700 net.cpp:] Memory required for data:
I1213 ::09.269201 layer_factory.hpp:] Creating layer accuracy
I1213 ::09.269701 net.cpp:] Creating Layer accuracy
I1213 ::09.270202 net.cpp:] accuracy <- score_score_0_split_0
I1213 ::09.270702 net.cpp:] accuracy <- label_label_0_split_0
I1213 ::09.271703 net.cpp:] accuracy -> accuracy
I1213 ::09.272202 net.cpp:] Setting up accuracy
I1213 ::09.272703 net.cpp:] Top shape: ()
I1213 ::09.273203 net.cpp:] Memory required for data:
I1213 ::09.273704 layer_factory.hpp:] Creating layer loss
I1213 ::09.274204 net.cpp:] Creating Layer loss
I1213 ::09.274704 net.cpp:] loss <- score_score_0_split_1
I1213 ::09.275205 net.cpp:] loss <- label_label_0_split_1
I1213 ::09.275707 net.cpp:] loss -> loss
I1213 ::09.276206 layer_factory.hpp:] Creating layer loss
I1213 ::09.279708 net.cpp:] Setting up loss
I1213 ::09.280208 net.cpp:] Top shape: ()
I1213 ::09.280709 net.cpp:] with loss weight
I1213 ::09.281208 net.cpp:] Memory required for data:
I1213 ::09.281708 net.cpp:] loss needs backward computation.
I1213 ::09.282209 net.cpp:] accuracy does not need backward computation.
I1213 ::09.282709 net.cpp:] score_score_0_split needs backward computation.
I1213 ::09.283210 net.cpp:] score needs backward computation.
I1213 ::09.283710 net.cpp:] upscore8 needs backward computation.
I1213 ::09.284210 net.cpp:] fuse_pool3 needs backward computation.
I1213 ::09.284711 net.cpp:] score_pool3c needs backward computation.
I1213 ::09.285212 net.cpp:] score_pool3 needs backward computation.
I1213 ::09.285712 net.cpp:] upscore_pool4_upscore_pool4_0_split needs backward computation.
I1213 ::09.286212 net.cpp:] upscore_pool4 needs backward computation.
I1213 ::09.286712 net.cpp:] fuse_pool4 needs backward computation.
I1213 ::09.287214 net.cpp:] score_pool4c needs backward computation.
I1213 ::09.287714 net.cpp:] score_pool4 needs backward computation.
I1213 ::09.288714 net.cpp:] upscore2_upscore2_0_split needs backward computation.
I1213 ::09.289214 net.cpp:] upscore2 needs backward computation.
I1213 ::09.289715 net.cpp:] score_fr needs backward computation.
I1213 ::09.290215 net.cpp:] drop7 needs backward computation.
I1213 ::09.290715 net.cpp:] relu7 needs backward computation.
I1213 ::09.291216 net.cpp:] fc7 needs backward computation.
I1213 ::09.291716 net.cpp:] drop6 needs backward computation.
I1213 ::09.293217 net.cpp:] relu6 needs backward computation.
I1213 ::09.294219 net.cpp:] fc6 needs backward computation.
I1213 ::09.294718 net.cpp:] pool5 needs backward computation.
I1213 ::09.295218 net.cpp:] relu5_3 needs backward computation.
I1213 ::09.295719 net.cpp:] conv5_3 needs backward computation.
I1213 ::09.296221 net.cpp:] relu5_2 needs backward computation.
I1213 ::09.297220 net.cpp:] conv5_2 needs backward computation.
I1213 ::09.297721 net.cpp:] relu5_1 needs backward computation.
I1213 ::09.298221 net.cpp:] conv5_1 needs backward computation.
I1213 ::09.298722 net.cpp:] pool4_pool4_0_split needs backward computation.
I1213 ::09.299222 net.cpp:] pool4 needs backward computation.
I1213 ::09.299722 net.cpp:] relu4_3 needs backward computation.
I1213 ::09.300222 net.cpp:] conv4_3 needs backward computation.
I1213 ::09.300724 net.cpp:] relu4_2 needs backward computation.
I1213 ::09.301223 net.cpp:] conv4_2 needs backward computation.
I1213 ::09.301725 net.cpp:] relu4_1 needs backward computation.
I1213 ::09.302224 net.cpp:] conv4_1 needs backward computation.
I1213 ::09.302724 net.cpp:] pool3_pool3_0_split needs backward computation.
I1213 ::09.303225 net.cpp:] pool3 needs backward computation.
I1213 ::09.303725 net.cpp:] relu3_3 needs backward computation.
I1213 ::09.304225 net.cpp:] conv3_3 needs backward computation.
I1213 ::09.305227 net.cpp:] relu3_2 needs backward computation.
I1213 ::09.305727 net.cpp:] conv3_2 needs backward computation.
I1213 ::09.306226 net.cpp:] relu3_1 needs backward computation.
I1213 ::09.306726 net.cpp:] conv3_1 needs backward computation.
I1213 ::09.307227 net.cpp:] pool2 needs backward computation.
I1213 ::09.309229 net.cpp:] relu2_2 needs backward computation.
I1213 ::09.310231 net.cpp:] conv2_2 needs backward computation.
I1213 ::09.310731 net.cpp:] relu2_1 needs backward computation.
I1213 ::09.311230 net.cpp:] conv2_1 needs backward computation.
I1213 ::09.311731 net.cpp:] pool1 needs backward computation.
I1213 ::09.312232 net.cpp:] relu1_2 needs backward computation.
I1213 ::09.312731 net.cpp:] conv1_2 needs backward computation.
I1213 ::09.313232 net.cpp:] relu1_1 needs backward computation.
I1213 ::09.313732 net.cpp:] conv1_1 needs backward computation.
I1213 ::09.314234 net.cpp:] label_label_0_split does not need backward computation.
I1213 ::09.314733 net.cpp:] label does not need backward computation.
I1213 ::09.315233 net.cpp:] data_data_0_split does not need backward computation.
I1213 ::09.315734 net.cpp:] data does not need backward computation.
I1213 ::09.316234 net.cpp:] This network produces output accuracy
I1213 ::09.316735 net.cpp:] This network produces output loss
I1213 ::09.317235 net.cpp:] Network initialization done.
I1213 ::09.318235 solver.cpp:] Solver scaffolding done.
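At this point the net is fully assembled: score_fr scores the fc7 features per class, upscore2 doubles their resolution, score_pool4 scores pool4, a Crop layer aligns the two maps, and an Eltwise SUM fuses them; the same pattern repeats with pool3 before the final 8x upsampling (upscore8). As a reference, the pool4 fusion stage seen in the log looks roughly like this in prototxt — a sketch following the public voc-fcn8s definition (num_output 21 = 20 classes + background; the exact crop offset and lr_mult values should be taken from the reference file, not from here):

```protobuf
layer {
  name: "upscore2"  type: "Deconvolution"
  bottom: "score_fr"  top: "upscore2"
  param { lr_mult: 0 }   # fixed bilinear upsampling filter, not learned
  convolution_param { num_output: 21 kernel_size: 4 stride: 2 bias_term: false }
}
layer {
  name: "score_pool4"  type: "Convolution"
  bottom: "pool4"  top: "score_pool4"
  convolution_param { num_output: 21 kernel_size: 1 }
}
layer {
  # crop score_pool4 to the spatial size of upscore2 so they can be summed
  name: "score_pool4c"  type: "Crop"
  bottom: "score_pool4"  bottom: "upscore2"  top: "score_pool4c"
  crop_param { axis: 2 offset: 5 }
}
layer {
  name: "fuse_pool4"  type: "Eltwise"
  bottom: "upscore2"  bottom: "score_pool4c"  top: "fuse_pool4"
  eltwise_param { operation: SUM }
}
```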
I1213 ::09.320236 caffe.cpp:] Finetuning from fcn8s-heavy-pascal.caffemodel
I1213 ::13.756211 net.cpp:] Copying source layer data
I1213 ::13.756702 net.cpp:] Copying source layer data_data_0_split
I1213 ::13.757203 net.cpp:] Copying source layer conv1_1
I1213 ::13.757203 net.cpp:] Copying source layer relu1_1
I1213 ::13.757704 net.cpp:] Copying source layer conv1_2
I1213 ::13.757704 net.cpp:] Copying source layer relu1_2
I1213 ::13.758203 net.cpp:] Copying source layer pool1
I1213 ::13.758203 net.cpp:] Copying source layer conv2_1
I1213 ::13.758703 net.cpp:] Copying source layer relu2_1
I1213 ::13.758703 net.cpp:] Copying source layer conv2_2
I1213 ::13.759204 net.cpp:] Copying source layer relu2_2
I1213 ::13.759704 net.cpp:] Copying source layer pool2
I1213 ::13.759704 net.cpp:] Copying source layer conv3_1
I1213 ::13.760705 net.cpp:] Copying source layer relu3_1
I1213 ::13.760705 net.cpp:] Copying source layer conv3_2
I1213 ::13.762207 net.cpp:] Copying source layer relu3_2
I1213 ::13.762207 net.cpp:] Copying source layer conv3_3
I1213 ::13.763208 net.cpp:] Copying source layer relu3_3
I1213 ::13.763707 net.cpp:] Copying source layer pool3
I1213 ::13.763707 net.cpp:] Copying source layer pool3_pool3_0_split
I1213 ::13.764207 net.cpp:] Copying source layer conv4_1
I1213 ::13.766211 net.cpp:] Copying source layer relu4_1
I1213 ::13.767215 net.cpp:] Copying source layer conv4_2
I1213 ::13.771214 net.cpp:] Copying source layer relu4_2
I1213 ::13.771714 net.cpp:] Copying source layer conv4_3
I1213 ::13.774719 net.cpp:] Copying source layer relu4_3
I1213 ::13.775215 net.cpp:] Copying source layer pool4
I1213 ::13.775215 net.cpp:] Copying source layer pool4_pool4_0_split
I1213 ::13.775717 net.cpp:] Copying source layer conv5_1
I1213 ::13.779729 net.cpp:] Copying source layer relu5_1
I1213 ::13.779729 net.cpp:] Copying source layer conv5_2
I1213 ::13.784723 net.cpp:] Copying source layer relu5_2
I1213 ::13.784723 net.cpp:] Copying source layer conv5_3
I1213 ::13.789227 net.cpp:] Copying source layer relu5_3
I1213 ::13.789227 net.cpp:] Copying source layer pool5
I1213 ::13.789726 net.cpp:] Copying source layer fc6
I1213 ::13.927826 net.cpp:] Copying source layer relu6
I1213 ::13.928326 net.cpp:] Copying source layer drop6
I1213 ::13.928825 net.cpp:] Copying source layer fc7
I1213 ::13.949340 net.cpp:] Copying source layer relu7
I1213 ::13.949340 net.cpp:] Copying source layer drop7
I1213 ::13.949841 net.cpp:] Copying source layer score_fr
I1213 ::13.950340 net.cpp:] Copying source layer upscore2
I1213 ::13.950340 net.cpp:] Copying source layer upscore2_upscore2_0_split
I1213 ::13.950840 net.cpp:] Copying source layer score_pool4
I1213 ::13.950840 net.cpp:] Copying source layer score_pool4c
I1213 ::13.951341 net.cpp:] Copying source layer fuse_pool4
I1213 ::13.951341 net.cpp:] Copying source layer upscore_pool4
I1213 ::13.952844 net.cpp:] Copying source layer upscore_pool4_upscore_pool4_0_split
I1213 ::13.953343 net.cpp:] Copying source layer score_pool3
I1213 ::13.953842 net.cpp:] Copying source layer score_pool3c
I1213 ::13.953842 net.cpp:] Copying source layer fuse_pool3
I1213 ::13.954344 net.cpp:] Copying source layer upscore8
I1213 ::13.954843 net.cpp:] Copying source layer score
I1213 ::13.954843 net.cpp:] Copying source layer loss
I1213 ::14.854532 net.cpp:] Copying source layer data
I1213 ::14.855533 net.cpp:] Copying source layer data_data_0_split
I1213 ::14.856040 net.cpp:] Copying source layer conv1_1
I1213 ::14.856040 net.cpp:] Copying source layer relu1_1
I1213 ::14.856533 net.cpp:] Copying source layer conv1_2
I1213 ::14.857034 net.cpp:] Copying source layer relu1_2
I1213 ::14.857034 net.cpp:] Copying source layer pool1
I1213 ::14.857533 net.cpp:] Copying source layer conv2_1
I1213 ::14.857533 net.cpp:] Copying source layer relu2_1
I1213 ::14.858036 net.cpp:] Copying source layer conv2_2
I1213 ::14.858536 net.cpp:] Copying source layer relu2_2
I1213 ::14.858536 net.cpp:] Copying source layer pool2
I1213 ::14.858536 net.cpp:] Copying source layer conv3_1
I1213 ::14.859539 net.cpp:] Copying source layer relu3_1
I1213 ::14.859539 net.cpp:] Copying source layer conv3_2
I1213 ::14.860539 net.cpp:] Copying source layer relu3_2
I1213 ::14.860539 net.cpp:] Copying source layer conv3_3
I1213 ::14.861539 net.cpp:] Copying source layer relu3_3
I1213 ::14.861539 net.cpp:] Copying source layer pool3
I1213 ::14.863039 net.cpp:] Copying source layer pool3_pool3_0_split
I1213 ::14.864039 net.cpp:] Copying source layer conv4_1
I1213 ::14.865545 net.cpp:] Copying source layer relu4_1
I1213 ::14.866041 net.cpp:] Copying source layer conv4_2
I1213 ::14.869045 net.cpp:] Copying source layer relu4_2
I1213 ::14.869045 net.cpp:] Copying source layer conv4_3
I1213 ::14.873046 net.cpp:] Copying source layer relu4_3
I1213 ::14.873545 net.cpp:] Copying source layer pool4
I1213 ::14.874047 net.cpp:] Copying source layer pool4_pool4_0_split
I1213 ::14.875052 net.cpp:] Copying source layer conv5_1
I1213 ::14.878060 net.cpp:] Copying source layer relu5_1
I1213 ::14.878548 net.cpp:] Copying source layer conv5_2
I1213 ::14.882055 net.cpp:] Copying source layer relu5_2
I1213 ::14.883080 net.cpp:] Copying source layer conv5_3
I1213 ::14.886059 net.cpp:] Copying source layer relu5_3
I1213 ::14.886555 net.cpp:] Copying source layer pool5
I1213 ::14.887054 net.cpp:] Copying source layer fc6
I1213 ::15.006645 net.cpp:] Copying source layer relu6
I1213 ::15.006645 net.cpp:] Copying source layer drop6
I1213 ::15.007140 net.cpp:] Copying source layer fc7
I1213 ::15.030658 net.cpp:] Copying source layer relu7
I1213 ::15.031158 net.cpp:] Copying source layer drop7
I1213 ::15.032158 net.cpp:] Copying source layer score_fr
I1213 ::15.034660 net.cpp:] Copying source layer upscore2
I1213 ::15.035661 net.cpp:] Copying source layer upscore2_upscore2_0_split
I1213 ::15.036164 net.cpp:] Copying source layer score_pool4
I1213 ::15.036666 net.cpp:] Copying source layer score_pool4c
I1213 ::15.036666 net.cpp:] Copying source layer fuse_pool4
I1213 ::15.037163 net.cpp:] Copying source layer upscore_pool4
I1213 ::15.037163 net.cpp:] Copying source layer upscore_pool4_upscore_pool4_0_split
I1213 ::15.038663 net.cpp:] Copying source layer score_pool3
I1213 ::15.038663 net.cpp:] Copying source layer score_pool3c
I1213 ::15.039165 net.cpp:] Copying source layer fuse_pool3
I1213 ::15.039664 net.cpp:] Copying source layer upscore8
I1213 ::15.040163 net.cpp:] Copying source layer score
I1213 ::15.042166 net.cpp:] Copying source layer loss
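Note that the weight copy above runs twice — once into the train net and once into the test net. One detail that is easy to miss when writing the prototxt from scratch: in the reference FCN code the upscore* Deconvolution layers are initialized with a fixed bilinear-interpolation filter (hence lr_mult: 0), not random weights. A minimal NumPy sketch of that filter, mirroring the logic of the reference repo's surgery script:

```python
import numpy as np

def bilinear_kernel(size):
    """Weights of a (size x size) bilinear-interpolation filter.

    This is the filter used to initialize the Deconvolution
    (upscore*) layers in the reference FCN implementation.
    """
    factor = (size + 1) // 2
    center = factor - 1 if size % 2 == 1 else factor - 0.5
    og = np.ogrid[:size, :size]
    # weight falls off linearly with distance from the kernel center
    return ((1 - abs(og[0] - center) / factor) *
            (1 - abs(og[1] - center) / factor))

# A stride-2 deconvolution uses a 4x4 kernel; center weights dominate.
k = bilinear_kernel(4)
print(k[1, 1])   # 0.5625
print(k.sum())   # 4.0
```

Before any fine-tuning, a deconvolution initialized this way behaves exactly like bilinear image resizing, which gives the skip-fusion stages a sensible starting point.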
I1213 ::15.088698 caffe.cpp:] Starting Optimization
I1213 ::15.089200 solver.cpp:] Solving
I1213 ::15.090199 solver.cpp:] Learning Rate Policy: fixed
I1213 ::15.136232 solver.cpp:] Iteration , Testing net (#)
I1213 ::26.184625 solver.cpp:] Test net output #: accuracy =
I1213 ::26.184625 solver.cpp:] Test net output #: loss = 5.40403 (* = 5.40403 loss)
I1213 ::26.345742 solver.cpp:] Iteration , loss = 10.3912
I1213 ::26.346242 solver.cpp:] Train net output #: loss = 10.3912 (* = 10.3912 loss)
I1213 ::26.346741 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::36.910781 solver.cpp:] Iteration , loss = 2.21905
I1213 ::36.910781 solver.cpp:] Train net output #: loss = 2.21906 (* = 2.21906 loss)
I1213 ::36.911775 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::47.624909 solver.cpp:] Iteration , loss = 4.72848
I1213 ::47.625411 solver.cpp:] Train net output #: loss = 4.72848 (* = 4.72848 loss)
I1213 ::47.625911 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::58.325038 solver.cpp:] Iteration , loss = 2.64817
I1213 ::58.325539 solver.cpp:] Train net output #: loss = 2.64817 (* = 2.64817 loss)
I1213 ::58.329041 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::08.973148 solver.cpp:] Iteration , loss = 2.92758
I1213 ::08.973649 solver.cpp:] Train net output #: loss = 2.92758 (* = 2.92758 loss)
I1213 ::08.973649 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::19.647307 solver.cpp:] Iteration , loss = 2.62991
I1213 ::19.647807 solver.cpp:] Train net output #: loss = 2.62992 (* = 2.62992 loss)
I1213 ::19.648298 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::30.322911 solver.cpp:] Iteration , loss = 4.65416
I1213 ::30.322911 solver.cpp:] Train net output #: loss = 4.65416 (* = 4.65416 loss)
I1213 ::30.323411 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::40.963016 solver.cpp:] Iteration , loss = 4.19446
I1213 ::40.963515 solver.cpp:] Train net output #: loss = 4.19446 (* = 4.19446 loss)
I1213 ::40.964017 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::51.605111 solver.cpp:] Iteration , loss = 3.29427
I1213 ::51.605111 solver.cpp:] Train net output #: loss = 3.29427 (* = 3.29427 loss)
I1213 ::51.605613 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::02.237191 solver.cpp:] Iteration , loss = 5.40818
I1213 ::02.237690 solver.cpp:] Train net output #: loss = 5.40819 (* = 5.40819 loss)
I1213 ::02.237690 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::12.995949 solver.cpp:] Iteration , loss = 7.47552
I1213 ::12.996439 solver.cpp:] Train net output #: loss = 7.47552 (* = 7.47552 loss)
I1213 ::12.996948 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::23.705600 solver.cpp:] Iteration , loss = 5.60802
I1213 ::23.705600 solver.cpp:] Train net output #: loss = 5.60802 (* = 5.60802 loss)
I1213 ::23.706099 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::34.293746 solver.cpp:] Iteration , loss = 2.95836
I1213 ::34.294245 solver.cpp:] Train net output #: loss = 2.95836 (* = 2.95836 loss)
I1213 ::34.294746 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::44.911836 solver.cpp:] Iteration , loss = 3.27787
I1213 ::44.912346 solver.cpp:] Train net output #: loss = 3.27787 (* = 3.27787 loss)
I1213 ::44.912838 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::55.712539 solver.cpp:] Iteration , loss = 3.04385
I1213 ::55.713040 solver.cpp:] Train net output #: loss = 3.04386 (* = 3.04386 loss)
I1213 ::55.713040 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::06.361131 solver.cpp:] Iteration , loss = 3.75074
I1213 ::06.361637 solver.cpp:] Train net output #: loss = 3.75075 (* = 3.75075 loss)
I1213 ::06.362130 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::17.130873 solver.cpp:] Iteration , loss = 2.53425
I1213 ::17.131374 solver.cpp:] Train net output #: loss = 2.53425 (* = 2.53425 loss)
I1213 ::17.131873 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::26.238867 solver.cpp:] Iteration , Testing net (#)
I1213 ::37.551930 solver.cpp:] Test net output #: accuracy =
I1213 ::37.552439 solver.cpp:] Test net output #: loss = 5.40402 (* = 5.40402 loss)
I1213 ::38.718762 solver.cpp:] Iteration , loss = 3.97111
I1213 ::38.718762 solver.cpp:] Train net output #: loss = 3.97111 (* = 3.97111 loss)
I1213 ::38.719262 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::49.461891 solver.cpp:] Iteration , loss = 3.32952
I1213 ::49.462391 solver.cpp:] Train net output #: loss = 3.32952 (* = 3.32952 loss)
I1213 ::49.462893 sgd_solver.cpp:] Iteration , lr = 1e-
I1213 ::00.222075 solver.cpp:] Iteration , loss = 5.11817
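The training log ends mid-run. To judge convergence it helps to pull the (iteration, loss) pairs out of the log and plot them. A small sketch, with a hypothetical sample line (the iteration numbers and timestamps in the paste above were lost, but a real Caffe log carries them):

```python
import re

# Extract (iteration, loss) pairs from a Caffe training log.
# The pattern follows the solver.cpp lines shown above.
LOSS_RE = re.compile(
    r"solver\.cpp:\d*\]\s*Iteration\s+(\d+),\s*loss\s*=\s*([\d.]+)")

def parse_losses(log_text):
    return [(int(it), float(loss)) for it, loss in LOSS_RE.findall(log_text)]

# Illustrative line in the same format as the real log output:
sample = "I1213 10:00:00.000000  1234 solver.cpp:228] Iteration 20, loss = 2.21905"
print(parse_losses(sample))   # [(20, 2.21905)]
```

The resulting pairs can be fed straight into matplotlib (or any plotting tool) to see whether the loss is actually trending down rather than oscillating, as the raw values above suggest it does only slowly.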
