(Original) Torch's apply function
Please credit the original source when reposting:
http://www.cnblogs.com/darkknightzh/p/6221633.html
The apply function in Torch traverses every module of a model. Internally it performs a depth-first (preorder) traversal.
Its implementation is as follows (see torch/install/share/lua/5.1/nn/Module.lua):
-- Run a callback (called with the module as an argument) in preorder over this
-- module and its children.
--
function Module:apply(callback)
   callback(self)
   if self.modules then
      for _, module in ipairs(self.modules) do
         module:apply(callback)
      end
   end
end
As the code shows, apply first invokes the callback on the module itself, then recursively calls itself on every entry of self.modules, bottoming out at modules that have no children.
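The same preorder recursion can be sketched in plain Python. This is a hypothetical stand-in class for illustration, not Torch's actual implementation; it only mirrors the traversal pattern of the Lua code above:

```python
class Module:
    """Minimal stand-in for nn.Module: a name plus optional children."""
    def __init__(self, name, modules=None):
        self.name = name
        self.modules = modules  # mirrors Torch's self.modules list

    def apply(self, callback):
        callback(self)                  # visit this module first (preorder)
        if self.modules:
            for module in self.modules:
                module.apply(callback)  # then recurse into each child

leaf_a = Module("conv")
leaf_b = Module("relu")
block = Module("block", [leaf_a, leaf_b])
net = Module("net", [block])

order = []
net.apply(lambda m: order.append(m.name))
print(order)  # ['net', 'block', 'conv', 'relu']
```

The parent ("net", then "block") is always recorded before its descendants, which is exactly the behavior the trace below exhibits.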
The following test code illustrates this. (The numeric arguments were lost in the original paste; they are reconstructed here from the printed model structure below.)

require "dpnn"

function createModel()
   local net = nn.Sequential()
   net:add(nn.SpatialConvolutionMM(3, 64, 7, 7, 2, 2, 3, 3))
   net:add(nn.SpatialBatchNormalization(64))
   net:add(nn.ReLU())
   net:add(nn.SpatialMaxPooling(3, 3, 2, 2, 1, 1))
   net:add(nn.Inception{
      inputSize = 192,
      kernelSize = {3, 5},
      kernelStride = {1, 1},
      outputSize = {128, 32},
      reduceSize = {96, 16, 32, 64},
      pool = nn.SpatialMaxPooling(3, 3, 1, 1, 1, 1),
      batchNorm = true
   })
   net:add(nn.Inception{
      inputSize = 256,
      kernelSize = {3, 5},
      kernelStride = {1, 1},
      outputSize = {128, 64},
      reduceSize = {96, 32, 64, 64},
      pool = nn.SpatialLPPooling(256, 2, 3, 3, 1, 1),
      batchNorm = false
   })
   net:add(nn.SpatialAveragePooling(7, 7))
   net:add(nn.View(320))
   net:add(nn.Linear(320, 128))
   net:add(nn.Normalize(2))
   return net
end

torch.setdefaulttensortype('torch.FloatTensor')

local model = createModel()
--print(model)

tt = 0
model:apply(function(module)
   tt = tt + 1
   print(tt, module)
end)
The output is:
1 nn.Sequential {
[input -> (1) -> (2) -> (3) -> (4) -> (5) -> (6) -> (7) -> (8) -> (9) -> (10) -> output]
(1): nn.SpatialConvolutionMM(3 -> 64, 7x7, 2,2, 3,3)
(2): nn.SpatialBatchNormalization
(3): nn.ReLU
(4): nn.SpatialMaxPooling(3x3, 2,2, 1,1)
(5): nn.Inception @ nn.DepthConcat {
input
|`-> (1): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> (5) -> (6) -> output]
| (1): nn.SpatialConvolution(192 -> 96, 1x1)
| (2): nn.SpatialBatchNormalization
| (3): nn.ReLU
| (4): nn.SpatialConvolution(96 -> 128, 3x3, 1,1, 1,1)
| (5): nn.SpatialBatchNormalization
| (6): nn.ReLU
| }
|`-> (2): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> (5) -> (6) -> output]
| (1): nn.SpatialConvolution(192 -> 16, 1x1)
| (2): nn.SpatialBatchNormalization
| (3): nn.ReLU
| (4): nn.SpatialConvolution(16 -> 32, 5x5, 1,1, 2,2)
| (5): nn.SpatialBatchNormalization
| (6): nn.ReLU
| }
|`-> (3): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> output]
| (1): nn.SpatialMaxPooling(3x3, 1,1, 1,1)
| (2): nn.SpatialConvolution(192 -> 32, 1x1)
| (3): nn.SpatialBatchNormalization
| (4): nn.ReLU
| }
|`-> (4): nn.Sequential {
[input -> (1) -> (2) -> (3) -> output]
(1): nn.SpatialConvolution(192 -> 64, 1x1)
(2): nn.SpatialBatchNormalization
(3): nn.ReLU
}
... -> output
}
(6): nn.Inception @ nn.DepthConcat {
input
|`-> (1): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> output]
| (1): nn.SpatialConvolution(256 -> 96, 1x1)
| (2): nn.ReLU
| (3): nn.SpatialConvolution(96 -> 128, 3x3, 1,1, 1,1)
| (4): nn.ReLU
| }
|`-> (2): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> output]
| (1): nn.SpatialConvolution(256 -> 32, 1x1)
| (2): nn.ReLU
| (3): nn.SpatialConvolution(32 -> 64, 5x5, 1,1, 2,2)
| (4): nn.ReLU
| }
|`-> (3): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> output]
| (1): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> output]
| (1): nn.Square
| (2): nn.SpatialAveragePooling(3x3, 1,1)
| (3): nn.MulConstant
| (4): nn.Sqrt
| }
| (2): nn.SpatialConvolution(256 -> 64, 1x1)
| (3): nn.ReLU
| }
|`-> (4): nn.Sequential {
[input -> (1) -> (2) -> output]
(1): nn.SpatialConvolution(256 -> 64, 1x1)
(2): nn.ReLU
}
... -> output
}
(7): nn.SpatialAveragePooling(7x7, 1,1)
(8): nn.View(320)
(9): nn.Linear(320 -> 128)
(10): nn.Normalize(2)
}
2 nn.SpatialConvolutionMM(3 -> 64, 7x7, 2,2, 3,3)
3 nn.SpatialBatchNormalization
4 nn.ReLU
5 nn.SpatialMaxPooling(3x3, 2,2, 1,1)
6 nn.Inception @ nn.DepthConcat {
input
|`-> (1): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> (5) -> (6) -> output]
| (1): nn.SpatialConvolution(192 -> 96, 1x1)
| (2): nn.SpatialBatchNormalization
| (3): nn.ReLU
| (4): nn.SpatialConvolution(96 -> 128, 3x3, 1,1, 1,1)
| (5): nn.SpatialBatchNormalization
| (6): nn.ReLU
| }
|`-> (2): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> (5) -> (6) -> output]
| (1): nn.SpatialConvolution(192 -> 16, 1x1)
| (2): nn.SpatialBatchNormalization
| (3): nn.ReLU
| (4): nn.SpatialConvolution(16 -> 32, 5x5, 1,1, 2,2)
| (5): nn.SpatialBatchNormalization
| (6): nn.ReLU
| }
|`-> (3): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> output]
| (1): nn.SpatialMaxPooling(3x3, 1,1, 1,1)
| (2): nn.SpatialConvolution(192 -> 32, 1x1)
| (3): nn.SpatialBatchNormalization
| (4): nn.ReLU
| }
|`-> (4): nn.Sequential {
[input -> (1) -> (2) -> (3) -> output]
(1): nn.SpatialConvolution(192 -> 64, 1x1)
(2): nn.SpatialBatchNormalization
(3): nn.ReLU
}
... -> output
}
7 nn.DepthConcat {
input
|`-> (1): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> (5) -> (6) -> output]
| (1): nn.SpatialConvolution(192 -> 96, 1x1)
| (2): nn.SpatialBatchNormalization
| (3): nn.ReLU
| (4): nn.SpatialConvolution(96 -> 128, 3x3, 1,1, 1,1)
| (5): nn.SpatialBatchNormalization
| (6): nn.ReLU
| }
|`-> (2): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> (5) -> (6) -> output]
| (1): nn.SpatialConvolution(192 -> 16, 1x1)
| (2): nn.SpatialBatchNormalization
| (3): nn.ReLU
| (4): nn.SpatialConvolution(16 -> 32, 5x5, 1,1, 2,2)
| (5): nn.SpatialBatchNormalization
| (6): nn.ReLU
| }
|`-> (3): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> output]
| (1): nn.SpatialMaxPooling(3x3, 1,1, 1,1)
| (2): nn.SpatialConvolution(192 -> 32, 1x1)
| (3): nn.SpatialBatchNormalization
| (4): nn.ReLU
| }
|`-> (4): nn.Sequential {
[input -> (1) -> (2) -> (3) -> output]
(1): nn.SpatialConvolution(192 -> 64, 1x1)
(2): nn.SpatialBatchNormalization
(3): nn.ReLU
}
... -> output
}
8 nn.Sequential {
[input -> (1) -> (2) -> (3) -> (4) -> (5) -> (6) -> output]
(1): nn.SpatialConvolution(192 -> 96, 1x1)
(2): nn.SpatialBatchNormalization
(3): nn.ReLU
(4): nn.SpatialConvolution(96 -> 128, 3x3, 1,1, 1,1)
(5): nn.SpatialBatchNormalization
(6): nn.ReLU
}
9 nn.SpatialConvolution(192 -> 96, 1x1)
10 nn.SpatialBatchNormalization
11 nn.ReLU
12 nn.SpatialConvolution(96 -> 128, 3x3, 1,1, 1,1)
13 nn.SpatialBatchNormalization
14 nn.ReLU
15 nn.Sequential {
[input -> (1) -> (2) -> (3) -> (4) -> (5) -> (6) -> output]
(1): nn.SpatialConvolution(192 -> 16, 1x1)
(2): nn.SpatialBatchNormalization
(3): nn.ReLU
(4): nn.SpatialConvolution(16 -> 32, 5x5, 1,1, 2,2)
(5): nn.SpatialBatchNormalization
(6): nn.ReLU
}
16 nn.SpatialConvolution(192 -> 16, 1x1)
17 nn.SpatialBatchNormalization
18 nn.ReLU
19 nn.SpatialConvolution(16 -> 32, 5x5, 1,1, 2,2)
20 nn.SpatialBatchNormalization
21 nn.ReLU
22 nn.Sequential {
[input -> (1) -> (2) -> (3) -> (4) -> output]
(1): nn.SpatialMaxPooling(3x3, 1,1, 1,1)
(2): nn.SpatialConvolution(192 -> 32, 1x1)
(3): nn.SpatialBatchNormalization
(4): nn.ReLU
}
23 nn.SpatialMaxPooling(3x3, 1,1, 1,1)
24 nn.SpatialConvolution(192 -> 32, 1x1)
25 nn.SpatialBatchNormalization
26 nn.ReLU
27 nn.Sequential {
[input -> (1) -> (2) -> (3) -> output]
(1): nn.SpatialConvolution(192 -> 64, 1x1)
(2): nn.SpatialBatchNormalization
(3): nn.ReLU
}
28 nn.SpatialConvolution(192 -> 64, 1x1)
29 nn.SpatialBatchNormalization
30 nn.ReLU
31 nn.Inception @ nn.DepthConcat {
input
|`-> (1): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> output]
| (1): nn.SpatialConvolution(256 -> 96, 1x1)
| (2): nn.ReLU
| (3): nn.SpatialConvolution(96 -> 128, 3x3, 1,1, 1,1)
| (4): nn.ReLU
| }
|`-> (2): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> output]
| (1): nn.SpatialConvolution(256 -> 32, 1x1)
| (2): nn.ReLU
| (3): nn.SpatialConvolution(32 -> 64, 5x5, 1,1, 2,2)
| (4): nn.ReLU
| }
|`-> (3): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> output]
| (1): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> output]
| (1): nn.Square
| (2): nn.SpatialAveragePooling(3x3, 1,1)
| (3): nn.MulConstant
| (4): nn.Sqrt
| }
| (2): nn.SpatialConvolution(256 -> 64, 1x1)
| (3): nn.ReLU
| }
|`-> (4): nn.Sequential {
[input -> (1) -> (2) -> output]
(1): nn.SpatialConvolution(256 -> 64, 1x1)
(2): nn.ReLU
}
... -> output
}
32 nn.DepthConcat {
input
|`-> (1): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> output]
| (1): nn.SpatialConvolution(256 -> 96, 1x1)
| (2): nn.ReLU
| (3): nn.SpatialConvolution(96 -> 128, 3x3, 1,1, 1,1)
| (4): nn.ReLU
| }
|`-> (2): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> output]
| (1): nn.SpatialConvolution(256 -> 32, 1x1)
| (2): nn.ReLU
| (3): nn.SpatialConvolution(32 -> 64, 5x5, 1,1, 2,2)
| (4): nn.ReLU
| }
|`-> (3): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> output]
| (1): nn.Sequential {
| [input -> (1) -> (2) -> (3) -> (4) -> output]
| (1): nn.Square
| (2): nn.SpatialAveragePooling(3x3, 1,1)
| (3): nn.MulConstant
| (4): nn.Sqrt
| }
| (2): nn.SpatialConvolution(256 -> 64, 1x1)
| (3): nn.ReLU
| }
|`-> (4): nn.Sequential {
[input -> (1) -> (2) -> output]
(1): nn.SpatialConvolution(256 -> 64, 1x1)
(2): nn.ReLU
}
... -> output
}
33 nn.Sequential {
[input -> (1) -> (2) -> (3) -> (4) -> output]
(1): nn.SpatialConvolution(256 -> 96, 1x1)
(2): nn.ReLU
(3): nn.SpatialConvolution(96 -> 128, 3x3, 1,1, 1,1)
(4): nn.ReLU
}
34 nn.SpatialConvolution(256 -> 96, 1x1)
35 nn.ReLU
36 nn.SpatialConvolution(96 -> 128, 3x3, 1,1, 1,1)
37 nn.ReLU
38 nn.Sequential {
[input -> (1) -> (2) -> (3) -> (4) -> output]
(1): nn.SpatialConvolution(256 -> 32, 1x1)
(2): nn.ReLU
(3): nn.SpatialConvolution(32 -> 64, 5x5, 1,1, 2,2)
(4): nn.ReLU
}
39 nn.SpatialConvolution(256 -> 32, 1x1)
40 nn.ReLU
41 nn.SpatialConvolution(32 -> 64, 5x5, 1,1, 2,2)
42 nn.ReLU
43 nn.Sequential {
[input -> (1) -> (2) -> (3) -> output]
(1): nn.Sequential {
[input -> (1) -> (2) -> (3) -> (4) -> output]
(1): nn.Square
(2): nn.SpatialAveragePooling(3x3, 1,1)
(3): nn.MulConstant
(4): nn.Sqrt
}
(2): nn.SpatialConvolution(256 -> 64, 1x1)
(3): nn.ReLU
}
44 nn.Sequential {
[input -> (1) -> (2) -> (3) -> (4) -> output]
(1): nn.Square
(2): nn.SpatialAveragePooling(3x3, 1,1)
(3): nn.MulConstant
(4): nn.Sqrt
}
45 nn.Square
46 nn.SpatialAveragePooling(3x3, 1,1)
47 nn.MulConstant
48 nn.Sqrt
49 nn.SpatialConvolution(256 -> 64, 1x1)
50 nn.ReLU
51 nn.Sequential {
[input -> (1) -> (2) -> output]
(1): nn.SpatialConvolution(256 -> 64, 1x1)
(2): nn.ReLU
}
52 nn.SpatialConvolution(256 -> 64, 1x1)
53 nn.ReLU
54 nn.SpatialAveragePooling(7x7, 1,1)
55 nn.View(320)
56 nn.Linear(320 -> 128)
57 nn.Normalize(2)
From this output we can see that the first visit prints the entire model, i.e. the topmost container.
Visits 2-5 print:
2 nn.SpatialConvolutionMM(3 -> 64, 7x7, 2,2, 3,3)
3 nn.SpatialBatchNormalization
4 nn.ReLU
5 nn.SpatialMaxPooling(3x3, 2,2, 1,1)
These are the layers preceding the first Inception module.
Visit 6 is the first Inception module, printed as nn.Inception @ nn.DepthConcat; visit 7 is the nn.DepthConcat container inside it.
Visit 8 is that Inception's first nn.Sequential branch, and visits 9-14 are the layers inside it; this reaches the bottom of the first branch.
Visit 15 is the second nn.Sequential branch, and visits 16-21 are its layers, reaching the bottom of the second branch.
Visit 22 is the third nn.Sequential branch, and visits 23-26 are its layers.
Visit 27 is the fourth nn.Sequential branch, and visits 28-30 are its layers.
At this point the first Inception module has been completely traversed, depth-first.
Visit 31 is the second Inception module (nn.Inception @ nn.DepthConcat), and visit 32 is its nn.DepthConcat. (Note: the two Inception modules were deliberately built with slightly different structures so they can be told apart in the trace.)
Visit 33 is its first nn.Sequential branch, and visits 34-37 are its layers.
Visit 38 is its second nn.Sequential branch, and visits 39-42 are its layers.
Visit 43 is its third nn.Sequential branch.
Visit 44 is the first submodule of that third branch, which is itself an nn.Sequential (the expansion of nn.SpatialLPPooling); visits 45-48 walk through it down to its leaves.
Visits 49-50 are the last two layers of the third branch.
Visit 51 is the fourth nn.Sequential branch, and visits 52-53 are its layers.
At this point the second Inception module has been completely traversed, depth-first.
Visits 54-57 are the four remaining top-level layers.
This trace confirms that apply traverses the model depth-first, visiting each container before recursing into its children (a preorder traversal).
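To make the visiting rule concrete, here is a small Python sketch (hypothetical toy classes, not Torch) of a nested container loosely analogous to the network above. The recorded order shows each container appearing before its children, exactly as in the numbered trace:

```python
class M:
    """Toy container: a name and a list of child modules."""
    def __init__(self, name, children=()):
        self.name = name
        self.modules = list(children)

    def apply(self, cb):
        cb(self)               # container first ...
        for child in self.modules:
            child.apply(cb)    # ... then each subtree, depth-first

# rough analogue of the model: a Sequential holding an Inception-like
# container whose branches are themselves Sequentials
net = M("Sequential", [
    M("Conv"),
    M("Inception", [
        M("Branch1", [M("Conv1x1"), M("ReLU")]),
        M("Branch2", [M("Pool"), M("Conv1x1")]),
    ]),
    M("Linear"),
])

visits = []
net.apply(lambda m: visits.append(m.name))
print(visits)
# ['Sequential', 'Conv', 'Inception', 'Branch1', 'Conv1x1', 'ReLU',
#  'Branch2', 'Pool', 'Conv1x1', 'Linear']
```

"Inception" is recorded before "Branch1", and "Branch1" before its leaves, mirroring how visit 6 (nn.Inception) precedes visit 8 (its first branch) in the real trace.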