1. Brief Introduction

VGG and GoogLeNet were the two standouts of the 2014 ImageNet competition, and the structural idea the two model families share is "go deeper". Unlike GoogLeNet, VGG inherits the framework of LeNet and AlexNet, and it resembles AlexNet in particular: VGG likewise consists of 5 groups of convolutions, 2 FC layers for image features, and 1 FC layer for classification, so it can be viewed as the same 8 parts as AlexNet. By varying the configuration of each of the first 5 convolution groups, the VGG paper derives the five configurations A through E, with the number of convolution layers growing from 8 to 16.
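To make the grouping concrete, here is the VGG-A stack described above written out as a small Python summary (the per-group convolution counts and the channel widths 64/128/256/512/512 follow configuration A of the paper and match the log in section 3; this is only an illustrative listing, not a runnable network definition):

# VGG-A: 5 convolution groups, then 2 FC feature layers and 1 FC classifier,
# the same 8-part layout as AlexNet.
# Each tuple is (group name, number of 3x3 conv layers, output channels).
VGG_A_CONV_GROUPS = [
    ("conv1", 1, 64),
    ("conv2", 1, 128),
    ("conv3", 2, 256),
    ("conv4", 2, 512),
    ("conv5", 2, 512),
]
VGG_A_FC_LAYERS = ["fc6 (4096)", "fc7 (4096)", "fc8 (num_classes)"]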

The paper shows that, as the depth grows step by step from 8 to 16 convolution layers, the accuracy gains taper off: simply adding more convolution layers appears to have hit a bottleneck. Some later papers studied pre-processing of the convolution-layer inputs (e.g., batch normalization) and post-processing of the outputs (e.g., PReLU). What the next direction of improvement will be is worth thinking about in depth.

2. Network Analysis

Starting from the configuration file at http://cs.stanford.edu/people/karpathy/vgg_train_val.prototxt and following the VGG paper, I modified it to obtain the VGG-A network structure.

While making those modifications you will notice that, in order to compare networks of different depths without altering the networks too much, VGG gives every convolution layer and every pooling layer the same layer parameters. This guarantees that the output shape of each group is identical no matter how many convolution layers the group contains.
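A minimal sketch of why this works, assuming the standard VGG hyper-parameters (3x3 convolutions with stride 1 and pad 1; 2x2 max pooling with stride 2):

def conv_out(size, kernel=3, stride=1, pad=1):
    # Caffe convolution output size: floor((in + 2*pad - kernel) / stride) + 1
    return (size + 2 * pad - kernel) // stride + 1

def pool_out(size, kernel=2, stride=2):
    # Caffe pooling output size: ceil((in - kernel) / stride) + 1
    return -(-(size - kernel) // stride) + 1

# A 3x3/stride-1/pad-1 convolution maps H x W to H x W, so stacking more
# convolutions inside a group never changes the spatial shape; only the
# 2x2/stride-2 pool that closes each group does, halving H and W.
assert conv_out(224) == 224
assert pool_out(224) == 112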

3. Network Log

Below is the detailed shape log (note that the conv5 block was captured from a second run, which is why its timestamps and process ID differ):

I0701 17:01:10.548092 26739 data_layer.cpp:85] output data size: 100,3,224,224
I0701 17:01:10.736845 26739 net.cpp:206] Top shape: 100 3 224 224 (15052800)
I0701 17:01:10.736912 26739 net.cpp:206] Top shape: 100 (100)
I0701 17:01:10.736929 26739 layer_factory.hpp:75] Creating layer conv1_1
I0701 17:01:10.736968 26739 net.cpp:166] Creating Layer conv1_1
I0701 17:01:10.736979 26739 net.cpp:496] conv1_1 <- data
I0701 17:01:10.737004 26739 net.cpp:452] conv1_1 -> conv1_1
I0701 17:01:10.737030 26739 net.cpp:197] Setting up conv1_1
I0701 17:01:10.738733 26739 net.cpp:206] Top shape: 100 64 224 224 (321126400)
I0701 17:01:10.738770 26739 layer_factory.hpp:75] Creating layer relu1_1
I0701 17:01:10.738786 26739 net.cpp:166] Creating Layer relu1_1
I0701 17:01:10.738824 26739 net.cpp:496] relu1_1 <- conv1_1
I0701 17:01:10.738838 26739 net.cpp:439] relu1_1 -> conv1_1 (in-place)
I0701 17:01:10.738853 26739 net.cpp:197] Setting up relu1_1
I0701 17:01:10.738867 26739 net.cpp:206] Top shape: 100 64 224 224 (321126400)
I0701 17:01:10.738878 26739 layer_factory.hpp:75] Creating layer pool1
I0701 17:01:10.738890 26739 net.cpp:166] Creating Layer pool1
I0701 17:01:10.738900 26739 net.cpp:496] pool1 <- conv1_1
I0701 17:01:10.738914 26739 net.cpp:452] pool1 -> pool1
I0701 17:01:10.738930 26739 net.cpp:197] Setting up pool1
I0701 17:01:10.738963 26739 net.cpp:206] Top shape: 100 64 112 112 (80281600)
I0701 17:01:10.738975 26739 layer_factory.hpp:75] Creating layer conv2_1
I0701 17:01:10.738992 26739 net.cpp:166] Creating Layer conv2_1
I0701 17:01:10.739001 26739 net.cpp:496] conv2_1 <- pool1
I0701 17:01:10.739017 26739 net.cpp:452] conv2_1 -> conv2_1
I0701 17:01:10.739030 26739 net.cpp:197] Setting up conv2_1
I0701 17:01:10.746640 26739 net.cpp:206] Top shape: 100 128 112 112 (160563200)
I0701 17:01:10.746669 26739 layer_factory.hpp:75] Creating layer relu2_1
I0701 17:01:10.746682 26739 net.cpp:166] Creating Layer relu2_1
I0701 17:01:10.746691 26739 net.cpp:496] relu2_1 <- conv2_1
I0701 17:01:10.746702 26739 net.cpp:439] relu2_1 -> conv2_1 (in-place)
I0701 17:01:10.746714 26739 net.cpp:197] Setting up relu2_1
I0701 17:01:10.746726 26739 net.cpp:206] Top shape: 100 128 112 112 (160563200)
I0701 17:01:10.746734 26739 layer_factory.hpp:75] Creating layer pool2
I0701 17:01:10.746749 26739 net.cpp:166] Creating Layer pool2
I0701 17:01:10.746759 26739 net.cpp:496] pool2 <- conv2_1
I0701 17:01:10.746770 26739 net.cpp:452] pool2 -> pool2
I0701 17:01:10.746783 26739 net.cpp:197] Setting up pool2
I0701 17:01:10.746798 26739 net.cpp:206] Top shape: 100 128 56 56 (40140800)
I0701 17:01:10.746809 26739 layer_factory.hpp:75] Creating layer conv3_1
I0701 17:01:10.746825 26739 net.cpp:166] Creating Layer conv3_1
I0701 17:01:10.746835 26739 net.cpp:496] conv3_1 <- pool2
I0701 17:01:10.746846 26739 net.cpp:452] conv3_1 -> conv3_1
I0701 17:01:10.746860 26739 net.cpp:197] Setting up conv3_1
I0701 17:01:10.747910 26739 net.cpp:206] Top shape: 100 256 56 56 (80281600)
I0701 17:01:10.747939 26739 layer_factory.hpp:75] Creating layer relu3_1
I0701 17:01:10.747954 26739 net.cpp:166] Creating Layer relu3_1
I0701 17:01:10.747963 26739 net.cpp:496] relu3_1 <- conv3_1
I0701 17:01:10.747974 26739 net.cpp:439] relu3_1 -> conv3_1 (in-place)
I0701 17:01:10.747985 26739 net.cpp:197] Setting up relu3_1
I0701 17:01:10.747997 26739 net.cpp:206] Top shape: 100 256 56 56 (80281600)
I0701 17:01:10.748009 26739 layer_factory.hpp:75] Creating layer conv3_2
I0701 17:01:10.748021 26739 net.cpp:166] Creating Layer conv3_2
I0701 17:01:10.748030 26739 net.cpp:496] conv3_2 <- conv3_1
I0701 17:01:10.748045 26739 net.cpp:452] conv3_2 -> conv3_2
I0701 17:01:10.748060 26739 net.cpp:197] Setting up conv3_2
I0701 17:01:10.750586 26739 net.cpp:206] Top shape: 100 256 56 56 (80281600)
I0701 17:01:10.750610 26739 layer_factory.hpp:75] Creating layer relu3_2
I0701 17:01:10.750624 26739 net.cpp:166] Creating Layer relu3_2
I0701 17:01:10.750635 26739 net.cpp:496] relu3_2 <- conv3_2
I0701 17:01:10.750648 26739 net.cpp:439] relu3_2 -> conv3_2 (in-place)
I0701 17:01:10.750669 26739 net.cpp:197] Setting up relu3_2
I0701 17:01:10.750681 26739 net.cpp:206] Top shape: 100 256 56 56 (80281600)
I0701 17:01:10.750690 26739 layer_factory.hpp:75] Creating layer pool3
I0701 17:01:10.750702 26739 net.cpp:166] Creating Layer pool3
I0701 17:01:10.750710 26739 net.cpp:496] pool3 <- conv3_2
I0701 17:01:10.750725 26739 net.cpp:452] pool3 -> pool3
I0701 17:01:10.750740 26739 net.cpp:197] Setting up pool3
I0701 17:01:10.750756 26739 net.cpp:206] Top shape: 100 256 28 28 (20070400)
I0701 17:01:10.750764 26739 layer_factory.hpp:75] Creating layer conv4_1
I0701 17:01:10.750779 26739 net.cpp:166] Creating Layer conv4_1
I0701 17:01:10.750788 26739 net.cpp:496] conv4_1 <- pool3
I0701 17:01:10.750800 26739 net.cpp:452] conv4_1 -> conv4_1
I0701 17:01:10.750825 26739 net.cpp:197] Setting up conv4_1
I0701 17:01:10.756436 26739 net.cpp:206] Top shape: 100 512 28 28 (40140800)
I0701 17:01:10.756474 26739 layer_factory.hpp:75] Creating layer relu4_1
I0701 17:01:10.756489 26739 net.cpp:166] Creating Layer relu4_1
I0701 17:01:10.756499 26739 net.cpp:496] relu4_1 <- conv4_1
I0701 17:01:10.756510 26739 net.cpp:439] relu4_1 -> conv4_1 (in-place)
I0701 17:01:10.756523 26739 net.cpp:197] Setting up relu4_1
I0701 17:01:10.756536 26739 net.cpp:206] Top shape: 100 512 28 28 (40140800)
I0701 17:01:10.756546 26739 layer_factory.hpp:75] Creating layer conv4_2
I0701 17:01:10.756559 26739 net.cpp:166] Creating Layer conv4_2
I0701 17:01:10.756568 26739 net.cpp:496] conv4_2 <- conv4_1
I0701 17:01:10.756583 26739 net.cpp:452] conv4_2 -> conv4_2
I0701 17:01:10.756597 26739 net.cpp:197] Setting up conv4_2
I0701 17:01:10.766434 26739 net.cpp:206] Top shape: 100 512 28 28 (40140800)
I0701 17:01:10.766474 26739 layer_factory.hpp:75] Creating layer relu4_2
I0701 17:01:10.766490 26739 net.cpp:166] Creating Layer relu4_2
I0701 17:01:10.766500 26739 net.cpp:496] relu4_2 <- conv4_2
I0701 17:01:10.766513 26739 net.cpp:439] relu4_2 -> conv4_2 (in-place)
I0701 17:01:10.766531 26739 net.cpp:197] Setting up relu4_2
I0701 17:01:10.766543 26739 net.cpp:206] Top shape: 100 512 28 28 (40140800)
I0701 17:01:10.766552 26739 layer_factory.hpp:75] Creating layer pool4
I0701 17:01:10.766573 26739 net.cpp:166] Creating Layer pool4
I0701 17:01:10.766582 26739 net.cpp:496] pool4 <- conv4_2
I0701 17:01:10.766595 26739 net.cpp:452] pool4 -> pool4
I0701 17:01:10.766608 26739 net.cpp:197] Setting up pool4
I0701 17:01:10.766624 26739 net.cpp:206] Top shape: 100 512 14 14 (10035200)
I0701 17:18:56.158187 29940 layer_factory.hpp:75] Creating layer conv5_1
I0701 17:18:56.158201 29940 net.cpp:166] Creating Layer conv5_1
I0701 17:18:56.158210 29940 net.cpp:496] conv5_1 <- pool4
I0701 17:18:56.158222 29940 net.cpp:452] conv5_1 -> conv5_1
I0701 17:18:56.158234 29940 net.cpp:197] Setting up conv5_1
I0701 17:18:56.168265 29940 net.cpp:206] Top shape: 100 512 14 14 (10035200)
I0701 17:18:56.168303 29940 layer_factory.hpp:75] Creating layer relu5_1
I0701 17:18:56.168320 29940 net.cpp:166] Creating Layer relu5_1
I0701 17:18:56.168329 29940 net.cpp:496] relu5_1 <- conv5_1
I0701 17:18:56.168341 29940 net.cpp:439] relu5_1 -> conv5_1 (in-place)
I0701 17:18:56.168355 29940 net.cpp:197] Setting up relu5_1
I0701 17:18:56.168368 29940 net.cpp:206] Top shape: 100 512 14 14 (10035200)
I0701 17:18:56.168378 29940 layer_factory.hpp:75] Creating layer conv5_2
I0701 17:18:56.168395 29940 net.cpp:166] Creating Layer conv5_2
I0701 17:18:56.168406 29940 net.cpp:496] conv5_2 <- conv5_1
I0701 17:18:56.168417 29940 net.cpp:452] conv5_2 -> conv5_2
I0701 17:18:56.168431 29940 net.cpp:197] Setting up conv5_2
I0701 17:18:56.178441 29940 net.cpp:206] Top shape: 100 512 14 14 (10035200)
I0701 17:18:56.178478 29940 layer_factory.hpp:75] Creating layer relu5_2
I0701 17:18:56.178493 29940 net.cpp:166] Creating Layer relu5_2
I0701 17:18:56.178501 29940 net.cpp:496] relu5_2 <- conv5_2
I0701 17:18:56.178514 29940 net.cpp:439] relu5_2 -> conv5_2 (in-place)
I0701 17:18:56.178527 29940 net.cpp:197] Setting up relu5_2
I0701 17:18:56.178539 29940 net.cpp:206] Top shape: 100 512 14 14 (10035200)
I0701 17:18:56.178547 29940 layer_factory.hpp:75] Creating layer pool5
I0701 17:18:56.178561 29940 net.cpp:166] Creating Layer pool5
I0701 17:18:56.178570 29940 net.cpp:496] pool5 <- conv5_2
I0701 17:18:56.178581 29940 net.cpp:452] pool5 -> pool5
I0701 17:18:56.178596 29940 net.cpp:197] Setting up pool5
I0701 17:18:56.178611 29940 net.cpp:206] Top shape: 100 512 7 7 (2508800)
I0701 17:18:56.178621 29940 layer_factory.hpp:75] Creating layer fc6
I0701 17:01:10.796613 26739 net.cpp:166] Creating Layer fc6
I0701 17:01:10.796622 26739 net.cpp:496] fc6 <- pool5
I0701 17:01:10.796634 26739 net.cpp:452] fc6 -> fc6
I0701 17:01:10.796650 26739 net.cpp:197] Setting up fc6
I0701 17:01:11.236284 26739 net.cpp:206] Top shape: 100 4096 (409600)
I0701 17:01:11.236351 26739 layer_factory.hpp:75] Creating layer relu6
I0701 17:01:11.236373 26739 net.cpp:166] Creating Layer relu6
I0701 17:01:11.236384 26739 net.cpp:496] relu6 <- fc6
I0701 17:01:11.236404 26739 net.cpp:439] relu6 -> fc6 (in-place)
I0701 17:01:11.236423 26739 net.cpp:197] Setting up relu6
I0701 17:01:11.236435 26739 net.cpp:206] Top shape: 100 4096 (409600)
I0701 17:01:11.236444 26739 layer_factory.hpp:75] Creating layer drop6
I0701 17:01:11.236464 26739 net.cpp:166] Creating Layer drop6
I0701 17:01:11.236472 26739 net.cpp:496] drop6 <- fc6
I0701 17:01:11.236486 26739 net.cpp:439] drop6 -> fc6 (in-place)
I0701 17:01:11.236500 26739 net.cpp:197] Setting up drop6
I0701 17:01:11.236524 26739 net.cpp:206] Top shape: 100 4096 (409600)
I0701 17:01:11.236534 26739 layer_factory.hpp:75] Creating layer fc7
I0701 17:01:11.236549 26739 net.cpp:166] Creating Layer fc7
I0701 17:01:11.236557 26739 net.cpp:496] fc7 <- fc6
I0701 17:01:11.236569 26739 net.cpp:452] fc7 -> fc7
I0701 17:01:11.236585 26739 net.cpp:197] Setting up fc7
I0701 17:01:11.301771 26739 net.cpp:206] Top shape: 100 4096 (409600)
I0701 17:01:11.301842 26739 layer_factory.hpp:75] Creating layer relu7
I0701 17:01:11.301864 26739 net.cpp:166] Creating Layer relu7
I0701 17:01:11.301877 26739 net.cpp:496] relu7 <- fc7
I0701 17:01:11.301898 26739 net.cpp:439] relu7 -> fc7 (in-place)
I0701 17:01:11.301916 26739 net.cpp:197] Setting up relu7
I0701 17:01:11.301929 26739 net.cpp:206] Top shape: 100 4096 (409600)
I0701 17:01:11.301939 26739 layer_factory.hpp:75] Creating layer drop7
I0701 17:01:11.301954 26739 net.cpp:166] Creating Layer drop7
I0701 17:01:11.301962 26739 net.cpp:496] drop7 <- fc7
I0701 17:01:11.301972 26739 net.cpp:439] drop7 -> fc7 (in-place)
I0701 17:01:11.301985 26739 net.cpp:197] Setting up drop7
I0701 17:01:11.302000 26739 net.cpp:206] Top shape: 100 4096 (409600)
I0701 17:01:11.302008 26739 layer_factory.hpp:75] Creating layer fc8
I0701 17:01:11.302023 26739 net.cpp:166] Creating Layer fc8
I0701 17:01:11.302032 26739 net.cpp:496] fc8 <- fc7
I0701 17:01:11.302044 26739 net.cpp:452] fc8 -> fc8
I0701 17:01:11.302058 26739 net.cpp:197] Setting up fc8
I0701 17:01:11.464764 26739 net.cpp:206] Top shape: 100 10000 (1000000)

For how these shapes are computed, refer to [caffe]深度学习之图像分类模型AlexNet解读.
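As a cross-check of the log above, the following short Python script (a sketch using the standard Caffe output-size formulas, not an official tool) walks the five VGG-A groups and reproduces every conv and pool "Top shape" the net printed:

def conv_out(size, kernel=3, stride=1, pad=1):
    return (size + 2 * pad - kernel) // stride + 1   # floor division

def pool_out(size, kernel=2, stride=2):
    return -(-(size - kernel) // stride) + 1         # ceil division

batch, size = 100, 224
groups = [(1, 64), (1, 128), (2, 256), (2, 512), (2, 512)]  # (convs, channels)
for g, (n_convs, channels) in enumerate(groups, start=1):
    for i in range(n_convs):
        size = conv_out(size)
        print(f"conv{g}_{i + 1} top shape: {batch} {channels} {size} {size}")
    size = pool_out(size)
    print(f"pool{g} top shape: {batch} {channels} {size} {size}")
# The loop ends with pool5: 100 512 7 7, which feeds fc6 (Top shape 100 4096),
# exactly as in the log.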
