andytonglove committed on
Commit 64e0e46 · verified · 1 Parent(s): 20dbca5

Upload 4 files

Browse files
Files changed (4)
  1. detect.caffemodel +0 -0
  2. detect.prototxt +2716 -0
  3. sr.caffemodel +0 -0
  4. sr.prototxt +403 -0
detect.caffemodel ADDED
Binary file (965 kB).
 
detect.prototxt ADDED
@@ -0,0 +1,2716 @@
1
+ layer {
2
+ name: "data"
3
+ type: "Input"
4
+ top: "data"
5
+ input_param {
6
+ shape {
7
+ dim: 1
8
+ dim: 1
9
+ dim: 384
10
+ dim: 384
11
+ }
12
+ }
13
+ }
14
+ layer {
15
+ name: "data/bn"
16
+ type: "BatchNorm"
17
+ bottom: "data"
18
+ top: "data"
19
+ param {
20
+ lr_mult: 0.0
21
+ decay_mult: 0.0
22
+ }
23
+ param {
24
+ lr_mult: 0.0
25
+ decay_mult: 0.0
26
+ }
27
+ param {
28
+ lr_mult: 0.0
29
+ decay_mult: 0.0
30
+ }
31
+ }
32
+ layer {
33
+ name: "data/bn/scale"
34
+ type: "Scale"
35
+ bottom: "data"
36
+ top: "data"
37
+ param {
38
+ lr_mult: 1.0
39
+ decay_mult: 0.0
40
+ }
41
+ param {
42
+ lr_mult: 1.0
43
+ decay_mult: 0.0
44
+ }
45
+ scale_param {
46
+ filler {
47
+ type: "constant"
48
+ value: 1.0
49
+ }
50
+ bias_term: true
51
+ bias_filler {
52
+ type: "constant"
53
+ value: 0.0
54
+ }
55
+ }
56
+ }
57
+ layer {
58
+ name: "stage1"
59
+ type: "Convolution"
60
+ bottom: "data"
61
+ top: "stage1"
62
+ param {
63
+ lr_mult: 1.0
64
+ decay_mult: 1.0
65
+ }
66
+ param {
67
+ lr_mult: 1.0
68
+ decay_mult: 0.0
69
+ }
70
+ convolution_param {
71
+ num_output: 24
72
+ bias_term: true
73
+ pad: 1
74
+ kernel_size: 3
75
+ group: 1
76
+ stride: 2
77
+ weight_filler {
78
+ type: "msra"
79
+ }
80
+ dilation: 1
81
+ }
82
+ }
83
+ layer {
84
+ name: "stage1/bn"
85
+ type: "BatchNorm"
86
+ bottom: "stage1"
87
+ top: "stage1"
88
+ param {
89
+ lr_mult: 0.0
90
+ decay_mult: 0.0
91
+ }
92
+ param {
93
+ lr_mult: 0.0
94
+ decay_mult: 0.0
95
+ }
96
+ param {
97
+ lr_mult: 0.0
98
+ decay_mult: 0.0
99
+ }
100
+ }
101
+ layer {
102
+ name: "stage1/bn/scale"
103
+ type: "Scale"
104
+ bottom: "stage1"
105
+ top: "stage1"
106
+ param {
107
+ lr_mult: 1.0
108
+ decay_mult: 0.0
109
+ }
110
+ param {
111
+ lr_mult: 1.0
112
+ decay_mult: 0.0
113
+ }
114
+ scale_param {
115
+ filler {
116
+ type: "constant"
117
+ value: 1.0
118
+ }
119
+ bias_term: true
120
+ bias_filler {
121
+ type: "constant"
122
+ value: 0.0
123
+ }
124
+ }
125
+ }
126
+ layer {
127
+ name: "stage1/relu"
128
+ type: "ReLU"
129
+ bottom: "stage1"
130
+ top: "stage1"
131
+ }
132
+ layer {
133
+ name: "stage2"
134
+ type: "Pooling"
135
+ bottom: "stage1"
136
+ top: "stage2"
137
+ pooling_param {
138
+ pool: MAX
139
+ kernel_size: 3
140
+ stride: 2
141
+ pad: 0
142
+ }
143
+ }
144
+ layer {
145
+ name: "stage3_1/conv1"
146
+ type: "Convolution"
147
+ bottom: "stage2"
148
+ top: "stage3_1/conv1"
149
+ param {
150
+ lr_mult: 1.0
151
+ decay_mult: 1.0
152
+ }
153
+ convolution_param {
154
+ num_output: 16
155
+ pad: 0
156
+ kernel_size: 1
157
+ group: 1
158
+ stride: 1
159
+ weight_filler {
160
+ type: "msra"
161
+ }
162
+ dilation: 1
163
+ }
164
+ }
165
+ layer {
166
+ name: "stage3_1/conv1/relu"
167
+ type: "ReLU"
168
+ bottom: "stage3_1/conv1"
169
+ top: "stage3_1/conv1"
170
+ }
171
+ layer {
172
+ name: "stage3_1/conv2"
173
+ type: "Convolution"
174
+ bottom: "stage3_1/conv1"
175
+ top: "stage3_1/conv2"
176
+ param {
177
+ lr_mult: 1.0
178
+ decay_mult: 1.0
179
+ }
180
+ convolution_param {
181
+ num_output: 16
182
+ pad: 1
183
+ kernel_size: 3
184
+ group: 16
185
+ stride: 2
186
+ weight_filler {
187
+ type: "msra"
188
+ }
189
+ dilation: 1
190
+ }
191
+ }
192
+ layer {
193
+ name: "stage3_1/conv3"
194
+ type: "Convolution"
195
+ bottom: "stage3_1/conv2"
196
+ top: "stage3_1/conv3"
197
+ param {
198
+ lr_mult: 1.0
199
+ decay_mult: 1.0
200
+ }
201
+ convolution_param {
202
+ num_output: 64
203
+ pad: 0
204
+ kernel_size: 1
205
+ group: 1
206
+ stride: 1
207
+ weight_filler {
208
+ type: "msra"
209
+ }
210
+ dilation: 1
211
+ }
212
+ }
213
+ layer {
214
+ name: "stage3_1/relu"
215
+ type: "ReLU"
216
+ bottom: "stage3_1/conv3"
217
+ top: "stage3_1/conv3"
218
+ }
219
+ layer {
220
+ name: "stage3_2/conv1"
221
+ type: "Convolution"
222
+ bottom: "stage3_1/conv3"
223
+ top: "stage3_2/conv1"
224
+ param {
225
+ lr_mult: 1.0
226
+ decay_mult: 1.0
227
+ }
228
+ convolution_param {
229
+ num_output: 16
230
+ pad: 0
231
+ kernel_size: 1
232
+ group: 1
233
+ stride: 1
234
+ weight_filler {
235
+ type: "msra"
236
+ }
237
+ dilation: 1
238
+ }
239
+ }
240
+ layer {
241
+ name: "stage3_2/conv1/relu"
242
+ type: "ReLU"
243
+ bottom: "stage3_2/conv1"
244
+ top: "stage3_2/conv1"
245
+ }
246
+ layer {
247
+ name: "stage3_2/conv2"
248
+ type: "Convolution"
249
+ bottom: "stage3_2/conv1"
250
+ top: "stage3_2/conv2"
251
+ param {
252
+ lr_mult: 1.0
253
+ decay_mult: 1.0
254
+ }
255
+ convolution_param {
256
+ num_output: 16
257
+ pad: 1
258
+ kernel_size: 3
259
+ group: 16
260
+ stride: 1
261
+ weight_filler {
262
+ type: "msra"
263
+ }
264
+ dilation: 1
265
+ }
266
+ }
267
+ layer {
268
+ name: "stage3_2/conv3"
269
+ type: "Convolution"
270
+ bottom: "stage3_2/conv2"
271
+ top: "stage3_2/conv3"
272
+ param {
273
+ lr_mult: 1.0
274
+ decay_mult: 1.0
275
+ }
276
+ convolution_param {
277
+ num_output: 64
278
+ pad: 0
279
+ kernel_size: 1
280
+ group: 1
281
+ stride: 1
282
+ weight_filler {
283
+ type: "msra"
284
+ }
285
+ dilation: 1
286
+ }
287
+ }
288
+ layer {
289
+ name: "stage3_2/sum"
290
+ type: "Eltwise"
291
+ bottom: "stage3_1/conv3"
292
+ bottom: "stage3_2/conv3"
293
+ top: "stage3_2/sum"
294
+ eltwise_param {
295
+ operation: SUM
296
+ }
297
+ }
298
+ layer {
299
+ name: "stage3_2/relu"
300
+ type: "ReLU"
301
+ bottom: "stage3_2/sum"
302
+ top: "stage3_2/sum"
303
+ }
304
+ layer {
305
+ name: "stage3_3/conv1"
306
+ type: "Convolution"
307
+ bottom: "stage3_2/sum"
308
+ top: "stage3_3/conv1"
309
+ param {
310
+ lr_mult: 1.0
311
+ decay_mult: 1.0
312
+ }
313
+ convolution_param {
314
+ num_output: 16
315
+ pad: 0
316
+ kernel_size: 1
317
+ group: 1
318
+ stride: 1
319
+ weight_filler {
320
+ type: "msra"
321
+ }
322
+ dilation: 1
323
+ }
324
+ }
325
+ layer {
326
+ name: "stage3_3/conv1/relu"
327
+ type: "ReLU"
328
+ bottom: "stage3_3/conv1"
329
+ top: "stage3_3/conv1"
330
+ }
331
+ layer {
332
+ name: "stage3_3/conv2"
333
+ type: "Convolution"
334
+ bottom: "stage3_3/conv1"
335
+ top: "stage3_3/conv2"
336
+ param {
337
+ lr_mult: 1.0
338
+ decay_mult: 1.0
339
+ }
340
+ convolution_param {
341
+ num_output: 16
342
+ pad: 1
343
+ kernel_size: 3
344
+ group: 16
345
+ stride: 1
346
+ weight_filler {
347
+ type: "msra"
348
+ }
349
+ dilation: 1
350
+ }
351
+ }
352
+ layer {
353
+ name: "stage3_3/conv3"
354
+ type: "Convolution"
355
+ bottom: "stage3_3/conv2"
356
+ top: "stage3_3/conv3"
357
+ param {
358
+ lr_mult: 1.0
359
+ decay_mult: 1.0
360
+ }
361
+ convolution_param {
362
+ num_output: 64
363
+ pad: 0
364
+ kernel_size: 1
365
+ group: 1
366
+ stride: 1
367
+ weight_filler {
368
+ type: "msra"
369
+ }
370
+ dilation: 1
371
+ }
372
+ }
373
+ layer {
374
+ name: "stage3_3/sum"
375
+ type: "Eltwise"
376
+ bottom: "stage3_2/sum"
377
+ bottom: "stage3_3/conv3"
378
+ top: "stage3_3/sum"
379
+ eltwise_param {
380
+ operation: SUM
381
+ }
382
+ }
383
+ layer {
384
+ name: "stage3_3/relu"
385
+ type: "ReLU"
386
+ bottom: "stage3_3/sum"
387
+ top: "stage3_3/sum"
388
+ }
389
+ layer {
390
+ name: "stage3_4/conv1"
391
+ type: "Convolution"
392
+ bottom: "stage3_3/sum"
393
+ top: "stage3_4/conv1"
394
+ param {
395
+ lr_mult: 1.0
396
+ decay_mult: 1.0
397
+ }
398
+ convolution_param {
399
+ num_output: 16
400
+ pad: 0
401
+ kernel_size: 1
402
+ group: 1
403
+ stride: 1
404
+ weight_filler {
405
+ type: "msra"
406
+ }
407
+ dilation: 1
408
+ }
409
+ }
410
+ layer {
411
+ name: "stage3_4/conv1/relu"
412
+ type: "ReLU"
413
+ bottom: "stage3_4/conv1"
414
+ top: "stage3_4/conv1"
415
+ }
416
+ layer {
417
+ name: "stage3_4/conv2"
418
+ type: "Convolution"
419
+ bottom: "stage3_4/conv1"
420
+ top: "stage3_4/conv2"
421
+ param {
422
+ lr_mult: 1.0
423
+ decay_mult: 1.0
424
+ }
425
+ convolution_param {
426
+ num_output: 16
427
+ pad: 1
428
+ kernel_size: 3
429
+ group: 16
430
+ stride: 1
431
+ weight_filler {
432
+ type: "msra"
433
+ }
434
+ dilation: 1
435
+ }
436
+ }
437
+ layer {
438
+ name: "stage3_4/conv3"
439
+ type: "Convolution"
440
+ bottom: "stage3_4/conv2"
441
+ top: "stage3_4/conv3"
442
+ param {
443
+ lr_mult: 1.0
444
+ decay_mult: 1.0
445
+ }
446
+ convolution_param {
447
+ num_output: 64
448
+ pad: 0
449
+ kernel_size: 1
450
+ group: 1
451
+ stride: 1
452
+ weight_filler {
453
+ type: "msra"
454
+ }
455
+ dilation: 1
456
+ }
457
+ }
458
+ layer {
459
+ name: "stage3_4/sum"
460
+ type: "Eltwise"
461
+ bottom: "stage3_3/sum"
462
+ bottom: "stage3_4/conv3"
463
+ top: "stage3_4/sum"
464
+ eltwise_param {
465
+ operation: SUM
466
+ }
467
+ }
468
+ layer {
469
+ name: "stage3_4/relu"
470
+ type: "ReLU"
471
+ bottom: "stage3_4/sum"
472
+ top: "stage3_4/sum"
473
+ }
474
+ layer {
475
+ name: "stage4_1/conv1"
476
+ type: "Convolution"
477
+ bottom: "stage3_4/sum"
478
+ top: "stage4_1/conv1"
479
+ param {
480
+ lr_mult: 1.0
481
+ decay_mult: 1.0
482
+ }
483
+ convolution_param {
484
+ num_output: 32
485
+ pad: 0
486
+ kernel_size: 1
487
+ group: 1
488
+ stride: 1
489
+ weight_filler {
490
+ type: "msra"
491
+ }
492
+ dilation: 1
493
+ }
494
+ }
495
+ layer {
496
+ name: "stage4_1/conv1/relu"
497
+ type: "ReLU"
498
+ bottom: "stage4_1/conv1"
499
+ top: "stage4_1/conv1"
500
+ }
501
+ layer {
502
+ name: "stage4_1/conv2"
503
+ type: "Convolution"
504
+ bottom: "stage4_1/conv1"
505
+ top: "stage4_1/conv2"
506
+ param {
507
+ lr_mult: 1.0
508
+ decay_mult: 1.0
509
+ }
510
+ convolution_param {
511
+ num_output: 32
512
+ pad: 1
513
+ kernel_size: 3
514
+ group: 32
515
+ stride: 2
516
+ weight_filler {
517
+ type: "msra"
518
+ }
519
+ dilation: 1
520
+ }
521
+ }
522
+ layer {
523
+ name: "stage4_1/conv3"
524
+ type: "Convolution"
525
+ bottom: "stage4_1/conv2"
526
+ top: "stage4_1/conv3"
527
+ param {
528
+ lr_mult: 1.0
529
+ decay_mult: 1.0
530
+ }
531
+ convolution_param {
532
+ num_output: 128
533
+ pad: 0
534
+ kernel_size: 1
535
+ group: 1
536
+ stride: 1
537
+ weight_filler {
538
+ type: "msra"
539
+ }
540
+ dilation: 1
541
+ }
542
+ }
543
+ layer {
544
+ name: "stage4_1/relu"
545
+ type: "ReLU"
546
+ bottom: "stage4_1/conv3"
547
+ top: "stage4_1/conv3"
548
+ }
549
+ layer {
550
+ name: "stage4_2/conv1"
551
+ type: "Convolution"
552
+ bottom: "stage4_1/conv3"
553
+ top: "stage4_2/conv1"
554
+ param {
555
+ lr_mult: 1.0
556
+ decay_mult: 1.0
557
+ }
558
+ convolution_param {
559
+ num_output: 32
560
+ pad: 0
561
+ kernel_size: 1
562
+ group: 1
563
+ stride: 1
564
+ weight_filler {
565
+ type: "msra"
566
+ }
567
+ dilation: 1
568
+ }
569
+ }
570
+ layer {
571
+ name: "stage4_2/conv1/relu"
572
+ type: "ReLU"
573
+ bottom: "stage4_2/conv1"
574
+ top: "stage4_2/conv1"
575
+ }
576
+ layer {
577
+ name: "stage4_2/conv2"
578
+ type: "Convolution"
579
+ bottom: "stage4_2/conv1"
580
+ top: "stage4_2/conv2"
581
+ param {
582
+ lr_mult: 1.0
583
+ decay_mult: 1.0
584
+ }
585
+ convolution_param {
586
+ num_output: 32
587
+ pad: 1
588
+ kernel_size: 3
589
+ group: 32
590
+ stride: 1
591
+ weight_filler {
592
+ type: "msra"
593
+ }
594
+ dilation: 1
595
+ }
596
+ }
597
+ layer {
598
+ name: "stage4_2/conv3"
599
+ type: "Convolution"
600
+ bottom: "stage4_2/conv2"
601
+ top: "stage4_2/conv3"
602
+ param {
603
+ lr_mult: 1.0
604
+ decay_mult: 1.0
605
+ }
606
+ convolution_param {
607
+ num_output: 128
608
+ pad: 0
609
+ kernel_size: 1
610
+ group: 1
611
+ stride: 1
612
+ weight_filler {
613
+ type: "msra"
614
+ }
615
+ dilation: 1
616
+ }
617
+ }
618
+ layer {
619
+ name: "stage4_2/sum"
620
+ type: "Eltwise"
621
+ bottom: "stage4_1/conv3"
622
+ bottom: "stage4_2/conv3"
623
+ top: "stage4_2/sum"
624
+ eltwise_param {
625
+ operation: SUM
626
+ }
627
+ }
628
+ layer {
629
+ name: "stage4_2/relu"
630
+ type: "ReLU"
631
+ bottom: "stage4_2/sum"
632
+ top: "stage4_2/sum"
633
+ }
634
+ layer {
635
+ name: "stage4_3/conv1"
636
+ type: "Convolution"
637
+ bottom: "stage4_2/sum"
638
+ top: "stage4_3/conv1"
639
+ param {
640
+ lr_mult: 1.0
641
+ decay_mult: 1.0
642
+ }
643
+ convolution_param {
644
+ num_output: 32
645
+ pad: 0
646
+ kernel_size: 1
647
+ group: 1
648
+ stride: 1
649
+ weight_filler {
650
+ type: "msra"
651
+ }
652
+ dilation: 1
653
+ }
654
+ }
655
+ layer {
656
+ name: "stage4_3/conv1/relu"
657
+ type: "ReLU"
658
+ bottom: "stage4_3/conv1"
659
+ top: "stage4_3/conv1"
660
+ }
661
+ layer {
662
+ name: "stage4_3/conv2"
663
+ type: "Convolution"
664
+ bottom: "stage4_3/conv1"
665
+ top: "stage4_3/conv2"
666
+ param {
667
+ lr_mult: 1.0
668
+ decay_mult: 1.0
669
+ }
670
+ convolution_param {
671
+ num_output: 32
672
+ pad: 1
673
+ kernel_size: 3
674
+ group: 32
675
+ stride: 1
676
+ weight_filler {
677
+ type: "msra"
678
+ }
679
+ dilation: 1
680
+ }
681
+ }
682
+ layer {
683
+ name: "stage4_3/conv3"
684
+ type: "Convolution"
685
+ bottom: "stage4_3/conv2"
686
+ top: "stage4_3/conv3"
687
+ param {
688
+ lr_mult: 1.0
689
+ decay_mult: 1.0
690
+ }
691
+ convolution_param {
692
+ num_output: 128
693
+ pad: 0
694
+ kernel_size: 1
695
+ group: 1
696
+ stride: 1
697
+ weight_filler {
698
+ type: "msra"
699
+ }
700
+ dilation: 1
701
+ }
702
+ }
703
+ layer {
704
+ name: "stage4_3/sum"
705
+ type: "Eltwise"
706
+ bottom: "stage4_2/sum"
707
+ bottom: "stage4_3/conv3"
708
+ top: "stage4_3/sum"
709
+ eltwise_param {
710
+ operation: SUM
711
+ }
712
+ }
713
+ layer {
714
+ name: "stage4_3/relu"
715
+ type: "ReLU"
716
+ bottom: "stage4_3/sum"
717
+ top: "stage4_3/sum"
718
+ }
719
+ layer {
720
+ name: "stage4_4/conv1"
721
+ type: "Convolution"
722
+ bottom: "stage4_3/sum"
723
+ top: "stage4_4/conv1"
724
+ param {
725
+ lr_mult: 1.0
726
+ decay_mult: 1.0
727
+ }
728
+ convolution_param {
729
+ num_output: 32
730
+ pad: 0
731
+ kernel_size: 1
732
+ group: 1
733
+ stride: 1
734
+ weight_filler {
735
+ type: "msra"
736
+ }
737
+ dilation: 1
738
+ }
739
+ }
740
+ layer {
741
+ name: "stage4_4/conv1/relu"
742
+ type: "ReLU"
743
+ bottom: "stage4_4/conv1"
744
+ top: "stage4_4/conv1"
745
+ }
746
+ layer {
747
+ name: "stage4_4/conv2"
748
+ type: "Convolution"
749
+ bottom: "stage4_4/conv1"
750
+ top: "stage4_4/conv2"
751
+ param {
752
+ lr_mult: 1.0
753
+ decay_mult: 1.0
754
+ }
755
+ convolution_param {
756
+ num_output: 32
757
+ pad: 1
758
+ kernel_size: 3
759
+ group: 32
760
+ stride: 1
761
+ weight_filler {
762
+ type: "msra"
763
+ }
764
+ dilation: 1
765
+ }
766
+ }
767
+ layer {
768
+ name: "stage4_4/conv3"
769
+ type: "Convolution"
770
+ bottom: "stage4_4/conv2"
771
+ top: "stage4_4/conv3"
772
+ param {
773
+ lr_mult: 1.0
774
+ decay_mult: 1.0
775
+ }
776
+ convolution_param {
777
+ num_output: 128
778
+ pad: 0
779
+ kernel_size: 1
780
+ group: 1
781
+ stride: 1
782
+ weight_filler {
783
+ type: "msra"
784
+ }
785
+ dilation: 1
786
+ }
787
+ }
788
+ layer {
789
+ name: "stage4_4/sum"
790
+ type: "Eltwise"
791
+ bottom: "stage4_3/sum"
792
+ bottom: "stage4_4/conv3"
793
+ top: "stage4_4/sum"
794
+ eltwise_param {
795
+ operation: SUM
796
+ }
797
+ }
798
+ layer {
799
+ name: "stage4_4/relu"
800
+ type: "ReLU"
801
+ bottom: "stage4_4/sum"
802
+ top: "stage4_4/sum"
803
+ }
804
+ layer {
805
+ name: "stage4_5/conv1"
806
+ type: "Convolution"
807
+ bottom: "stage4_4/sum"
808
+ top: "stage4_5/conv1"
809
+ param {
810
+ lr_mult: 1.0
811
+ decay_mult: 1.0
812
+ }
813
+ convolution_param {
814
+ num_output: 32
815
+ pad: 0
816
+ kernel_size: 1
817
+ group: 1
818
+ stride: 1
819
+ weight_filler {
820
+ type: "msra"
821
+ }
822
+ dilation: 1
823
+ }
824
+ }
825
+ layer {
826
+ name: "stage4_5/conv1/relu"
827
+ type: "ReLU"
828
+ bottom: "stage4_5/conv1"
829
+ top: "stage4_5/conv1"
830
+ }
831
+ layer {
832
+ name: "stage4_5/conv2"
833
+ type: "Convolution"
834
+ bottom: "stage4_5/conv1"
835
+ top: "stage4_5/conv2"
836
+ param {
837
+ lr_mult: 1.0
838
+ decay_mult: 1.0
839
+ }
840
+ convolution_param {
841
+ num_output: 32
842
+ pad: 1
843
+ kernel_size: 3
844
+ group: 32
845
+ stride: 1
846
+ weight_filler {
847
+ type: "msra"
848
+ }
849
+ dilation: 1
850
+ }
851
+ }
852
+ layer {
853
+ name: "stage4_5/conv3"
854
+ type: "Convolution"
855
+ bottom: "stage4_5/conv2"
856
+ top: "stage4_5/conv3"
857
+ param {
858
+ lr_mult: 1.0
859
+ decay_mult: 1.0
860
+ }
861
+ convolution_param {
862
+ num_output: 128
863
+ pad: 0
864
+ kernel_size: 1
865
+ group: 1
866
+ stride: 1
867
+ weight_filler {
868
+ type: "msra"
869
+ }
870
+ dilation: 1
871
+ }
872
+ }
873
+ layer {
874
+ name: "stage4_5/sum"
875
+ type: "Eltwise"
876
+ bottom: "stage4_4/sum"
877
+ bottom: "stage4_5/conv3"
878
+ top: "stage4_5/sum"
879
+ eltwise_param {
880
+ operation: SUM
881
+ }
882
+ }
883
+ layer {
884
+ name: "stage4_5/relu"
885
+ type: "ReLU"
886
+ bottom: "stage4_5/sum"
887
+ top: "stage4_5/sum"
888
+ }
889
+ layer {
890
+ name: "stage4_6/conv1"
891
+ type: "Convolution"
892
+ bottom: "stage4_5/sum"
893
+ top: "stage4_6/conv1"
894
+ param {
895
+ lr_mult: 1.0
896
+ decay_mult: 1.0
897
+ }
898
+ convolution_param {
899
+ num_output: 32
900
+ pad: 0
901
+ kernel_size: 1
902
+ group: 1
903
+ stride: 1
904
+ weight_filler {
905
+ type: "msra"
906
+ }
907
+ dilation: 1
908
+ }
909
+ }
910
+ layer {
911
+ name: "stage4_6/conv1/relu"
912
+ type: "ReLU"
913
+ bottom: "stage4_6/conv1"
914
+ top: "stage4_6/conv1"
915
+ }
916
+ layer {
917
+ name: "stage4_6/conv2"
918
+ type: "Convolution"
919
+ bottom: "stage4_6/conv1"
920
+ top: "stage4_6/conv2"
921
+ param {
922
+ lr_mult: 1.0
923
+ decay_mult: 1.0
924
+ }
925
+ convolution_param {
926
+ num_output: 32
927
+ pad: 1
928
+ kernel_size: 3
929
+ group: 32
930
+ stride: 1
931
+ weight_filler {
932
+ type: "msra"
933
+ }
934
+ dilation: 1
935
+ }
936
+ }
937
+ layer {
938
+ name: "stage4_6/conv3"
939
+ type: "Convolution"
940
+ bottom: "stage4_6/conv2"
941
+ top: "stage4_6/conv3"
942
+ param {
943
+ lr_mult: 1.0
944
+ decay_mult: 1.0
945
+ }
946
+ convolution_param {
947
+ num_output: 128
948
+ pad: 0
949
+ kernel_size: 1
950
+ group: 1
951
+ stride: 1
952
+ weight_filler {
953
+ type: "msra"
954
+ }
955
+ dilation: 1
956
+ }
957
+ }
958
+ layer {
959
+ name: "stage4_6/sum"
960
+ type: "Eltwise"
961
+ bottom: "stage4_5/sum"
962
+ bottom: "stage4_6/conv3"
963
+ top: "stage4_6/sum"
964
+ eltwise_param {
965
+ operation: SUM
966
+ }
967
+ }
968
+ layer {
969
+ name: "stage4_6/relu"
970
+ type: "ReLU"
971
+ bottom: "stage4_6/sum"
972
+ top: "stage4_6/sum"
973
+ }
974
+ layer {
975
+ name: "stage4_7/conv1"
976
+ type: "Convolution"
977
+ bottom: "stage4_6/sum"
978
+ top: "stage4_7/conv1"
979
+ param {
980
+ lr_mult: 1.0
981
+ decay_mult: 1.0
982
+ }
983
+ convolution_param {
984
+ num_output: 32
985
+ pad: 0
986
+ kernel_size: 1
987
+ group: 1
988
+ stride: 1
989
+ weight_filler {
990
+ type: "msra"
991
+ }
992
+ dilation: 1
993
+ }
994
+ }
995
+ layer {
996
+ name: "stage4_7/conv1/relu"
997
+ type: "ReLU"
998
+ bottom: "stage4_7/conv1"
999
+ top: "stage4_7/conv1"
1000
+ }
1001
+ layer {
1002
+ name: "stage4_7/conv2"
1003
+ type: "Convolution"
1004
+ bottom: "stage4_7/conv1"
1005
+ top: "stage4_7/conv2"
1006
+ param {
1007
+ lr_mult: 1.0
1008
+ decay_mult: 1.0
1009
+ }
1010
+ convolution_param {
1011
+ num_output: 32
1012
+ pad: 1
1013
+ kernel_size: 3
1014
+ group: 32
1015
+ stride: 1
1016
+ weight_filler {
1017
+ type: "msra"
1018
+ }
1019
+ dilation: 1
1020
+ }
1021
+ }
1022
+ layer {
1023
+ name: "stage4_7/conv3"
1024
+ type: "Convolution"
1025
+ bottom: "stage4_7/conv2"
1026
+ top: "stage4_7/conv3"
1027
+ param {
1028
+ lr_mult: 1.0
1029
+ decay_mult: 1.0
1030
+ }
1031
+ convolution_param {
1032
+ num_output: 128
1033
+ pad: 0
1034
+ kernel_size: 1
1035
+ group: 1
1036
+ stride: 1
1037
+ weight_filler {
1038
+ type: "msra"
1039
+ }
1040
+ dilation: 1
1041
+ }
1042
+ }
1043
+ layer {
1044
+ name: "stage4_7/sum"
1045
+ type: "Eltwise"
1046
+ bottom: "stage4_6/sum"
1047
+ bottom: "stage4_7/conv3"
1048
+ top: "stage4_7/sum"
1049
+ eltwise_param {
1050
+ operation: SUM
1051
+ }
1052
+ }
1053
+ layer {
1054
+ name: "stage4_7/relu"
1055
+ type: "ReLU"
1056
+ bottom: "stage4_7/sum"
1057
+ top: "stage4_7/sum"
1058
+ }
1059
+ layer {
1060
+ name: "stage4_8/conv1"
1061
+ type: "Convolution"
1062
+ bottom: "stage4_7/sum"
1063
+ top: "stage4_8/conv1"
1064
+ param {
1065
+ lr_mult: 1.0
1066
+ decay_mult: 1.0
1067
+ }
1068
+ convolution_param {
1069
+ num_output: 32
1070
+ pad: 0
1071
+ kernel_size: 1
1072
+ group: 1
1073
+ stride: 1
1074
+ weight_filler {
1075
+ type: "msra"
1076
+ }
1077
+ dilation: 1
1078
+ }
1079
+ }
1080
+ layer {
1081
+ name: "stage4_8/conv1/relu"
1082
+ type: "ReLU"
1083
+ bottom: "stage4_8/conv1"
1084
+ top: "stage4_8/conv1"
1085
+ }
1086
+ layer {
1087
+ name: "stage4_8/conv2"
1088
+ type: "Convolution"
1089
+ bottom: "stage4_8/conv1"
1090
+ top: "stage4_8/conv2"
1091
+ param {
1092
+ lr_mult: 1.0
1093
+ decay_mult: 1.0
1094
+ }
1095
+ convolution_param {
1096
+ num_output: 32
1097
+ pad: 1
1098
+ kernel_size: 3
1099
+ group: 32
1100
+ stride: 1
1101
+ weight_filler {
1102
+ type: "msra"
1103
+ }
1104
+ dilation: 1
1105
+ }
1106
+ }
1107
+ layer {
1108
+ name: "stage4_8/conv3"
1109
+ type: "Convolution"
1110
+ bottom: "stage4_8/conv2"
1111
+ top: "stage4_8/conv3"
1112
+ param {
1113
+ lr_mult: 1.0
1114
+ decay_mult: 1.0
1115
+ }
1116
+ convolution_param {
1117
+ num_output: 128
1118
+ pad: 0
1119
+ kernel_size: 1
1120
+ group: 1
1121
+ stride: 1
1122
+ weight_filler {
1123
+ type: "msra"
1124
+ }
1125
+ dilation: 1
1126
+ }
1127
+ }
1128
+ layer {
1129
+ name: "stage4_8/sum"
1130
+ type: "Eltwise"
1131
+ bottom: "stage4_7/sum"
1132
+ bottom: "stage4_8/conv3"
1133
+ top: "stage4_8/sum"
1134
+ eltwise_param {
1135
+ operation: SUM
1136
+ }
1137
+ }
1138
+ layer {
1139
+ name: "stage4_8/relu"
1140
+ type: "ReLU"
1141
+ bottom: "stage4_8/sum"
1142
+ top: "stage4_8/sum"
1143
+ }
1144
+ layer {
1145
+ name: "stage5_1/conv1"
1146
+ type: "Convolution"
1147
+ bottom: "stage4_8/sum"
1148
+ top: "stage5_1/conv1"
1149
+ param {
1150
+ lr_mult: 1.0
1151
+ decay_mult: 1.0
1152
+ }
1153
+ convolution_param {
1154
+ num_output: 32
1155
+ pad: 0
1156
+ kernel_size: 1
1157
+ group: 1
1158
+ stride: 1
1159
+ weight_filler {
1160
+ type: "msra"
1161
+ }
1162
+ dilation: 1
1163
+ }
1164
+ }
1165
+ layer {
1166
+ name: "stage5_1/conv1/relu"
1167
+ type: "ReLU"
1168
+ bottom: "stage5_1/conv1"
1169
+ top: "stage5_1/conv1"
1170
+ }
1171
+ layer {
1172
+ name: "stage5_1/conv2"
1173
+ type: "Convolution"
1174
+ bottom: "stage5_1/conv1"
1175
+ top: "stage5_1/conv2"
1176
+ param {
1177
+ lr_mult: 1.0
1178
+ decay_mult: 1.0
1179
+ }
1180
+ convolution_param {
1181
+ num_output: 32
1182
+ pad: 2
1183
+ kernel_size: 3
1184
+ group: 32
1185
+ stride: 2
1186
+ weight_filler {
1187
+ type: "msra"
1188
+ }
1189
+ dilation: 2
1190
+ }
1191
+ }
1192
+ layer {
1193
+ name: "stage5_1/conv3"
1194
+ type: "Convolution"
1195
+ bottom: "stage5_1/conv2"
1196
+ top: "stage5_1/conv3"
1197
+ param {
1198
+ lr_mult: 1.0
1199
+ decay_mult: 1.0
1200
+ }
1201
+ convolution_param {
1202
+ num_output: 128
1203
+ pad: 0
1204
+ kernel_size: 1
1205
+ group: 1
1206
+ stride: 1
1207
+ weight_filler {
1208
+ type: "msra"
1209
+ }
1210
+ dilation: 1
1211
+ }
1212
+ }
1213
+ layer {
1214
+ name: "stage5_1/relu"
1215
+ type: "ReLU"
1216
+ bottom: "stage5_1/conv3"
1217
+ top: "stage5_1/conv3"
1218
+ }
1219
+ layer {
1220
+ name: "stage5_2/conv1"
1221
+ type: "Convolution"
1222
+ bottom: "stage5_1/conv3"
1223
+ top: "stage5_2/conv1"
1224
+ param {
1225
+ lr_mult: 1.0
1226
+ decay_mult: 1.0
1227
+ }
1228
+ convolution_param {
1229
+ num_output: 32
1230
+ pad: 0
1231
+ kernel_size: 1
1232
+ group: 1
1233
+ stride: 1
1234
+ weight_filler {
1235
+ type: "msra"
1236
+ }
1237
+ dilation: 1
1238
+ }
1239
+ }
1240
+ layer {
1241
+ name: "stage5_2/conv1/relu"
1242
+ type: "ReLU"
1243
+ bottom: "stage5_2/conv1"
1244
+ top: "stage5_2/conv1"
1245
+ }
1246
+ layer {
1247
+ name: "stage5_2/conv2"
1248
+ type: "Convolution"
1249
+ bottom: "stage5_2/conv1"
1250
+ top: "stage5_2/conv2"
1251
+ param {
1252
+ lr_mult: 1.0
1253
+ decay_mult: 1.0
1254
+ }
1255
+ convolution_param {
1256
+ num_output: 32
1257
+ pad: 2
1258
+ kernel_size: 3
1259
+ group: 32
1260
+ stride: 1
1261
+ weight_filler {
1262
+ type: "msra"
1263
+ }
1264
+ dilation: 2
1265
+ }
1266
+ }
1267
+ layer {
1268
+ name: "stage5_2/conv3"
1269
+ type: "Convolution"
1270
+ bottom: "stage5_2/conv2"
1271
+ top: "stage5_2/conv3"
1272
+ param {
1273
+ lr_mult: 1.0
1274
+ decay_mult: 1.0
1275
+ }
1276
+ convolution_param {
1277
+ num_output: 128
1278
+ pad: 0
1279
+ kernel_size: 1
1280
+ group: 1
1281
+ stride: 1
1282
+ weight_filler {
1283
+ type: "msra"
1284
+ }
1285
+ dilation: 1
1286
+ }
1287
+ }
1288
+ layer {
1289
+ name: "stage5_2/sum"
1290
+ type: "Eltwise"
1291
+ bottom: "stage5_1/conv3"
1292
+ bottom: "stage5_2/conv3"
1293
+ top: "stage5_2/sum"
1294
+ eltwise_param {
1295
+ operation: SUM
1296
+ }
1297
+ }
1298
+ layer {
1299
+ name: "stage5_2/relu"
1300
+ type: "ReLU"
1301
+ bottom: "stage5_2/sum"
1302
+ top: "stage5_2/sum"
1303
+ }
1304
+ layer {
1305
+ name: "stage5_3/conv1"
1306
+ type: "Convolution"
1307
+ bottom: "stage5_2/sum"
1308
+ top: "stage5_3/conv1"
1309
+ param {
1310
+ lr_mult: 1.0
1311
+ decay_mult: 1.0
1312
+ }
1313
+ convolution_param {
1314
+ num_output: 32
1315
+ pad: 0
1316
+ kernel_size: 1
1317
+ group: 1
1318
+ stride: 1
1319
+ weight_filler {
1320
+ type: "msra"
1321
+ }
1322
+ dilation: 1
1323
+ }
1324
+ }
1325
+ layer {
1326
+ name: "stage5_3/conv1/relu"
1327
+ type: "ReLU"
1328
+ bottom: "stage5_3/conv1"
1329
+ top: "stage5_3/conv1"
1330
+ }
1331
+ layer {
1332
+ name: "stage5_3/conv2"
1333
+ type: "Convolution"
1334
+ bottom: "stage5_3/conv1"
1335
+ top: "stage5_3/conv2"
1336
+ param {
1337
+ lr_mult: 1.0
1338
+ decay_mult: 1.0
1339
+ }
1340
+ convolution_param {
1341
+ num_output: 32
1342
+ pad: 2
1343
+ kernel_size: 3
1344
+ group: 32
1345
+ stride: 1
1346
+ weight_filler {
1347
+ type: "msra"
1348
+ }
1349
+ dilation: 2
1350
+ }
1351
+ }
1352
+ layer {
1353
+ name: "stage5_3/conv3"
1354
+ type: "Convolution"
1355
+ bottom: "stage5_3/conv2"
1356
+ top: "stage5_3/conv3"
1357
+ param {
1358
+ lr_mult: 1.0
1359
+ decay_mult: 1.0
1360
+ }
1361
+ convolution_param {
1362
+ num_output: 128
1363
+ pad: 0
1364
+ kernel_size: 1
1365
+ group: 1
1366
+ stride: 1
1367
+ weight_filler {
1368
+ type: "msra"
1369
+ }
1370
+ dilation: 1
1371
+ }
1372
+ }
1373
+ layer {
1374
+ name: "stage5_3/sum"
1375
+ type: "Eltwise"
1376
+ bottom: "stage5_2/sum"
1377
+ bottom: "stage5_3/conv3"
1378
+ top: "stage5_3/sum"
1379
+ eltwise_param {
1380
+ operation: SUM
1381
+ }
1382
+ }
1383
+ layer {
1384
+ name: "stage5_3/relu"
1385
+ type: "ReLU"
1386
+ bottom: "stage5_3/sum"
1387
+ top: "stage5_3/sum"
1388
+ }
1389
+ layer {
1390
+ name: "stage5_4/conv1"
1391
+ type: "Convolution"
1392
+ bottom: "stage5_3/sum"
1393
+ top: "stage5_4/conv1"
1394
+ param {
1395
+ lr_mult: 1.0
1396
+ decay_mult: 1.0
1397
+ }
1398
+ convolution_param {
1399
+ num_output: 32
1400
+ pad: 0
1401
+ kernel_size: 1
1402
+ group: 1
1403
+ stride: 1
1404
+ weight_filler {
1405
+ type: "msra"
1406
+ }
1407
+ dilation: 1
1408
+ }
1409
+ }
1410
+ layer {
1411
+ name: "stage5_4/conv1/relu"
1412
+ type: "ReLU"
1413
+ bottom: "stage5_4/conv1"
1414
+ top: "stage5_4/conv1"
1415
+ }
1416
+ layer {
1417
+ name: "stage5_4/conv2"
1418
+ type: "Convolution"
1419
+ bottom: "stage5_4/conv1"
1420
+ top: "stage5_4/conv2"
1421
+ param {
1422
+ lr_mult: 1.0
1423
+ decay_mult: 1.0
1424
+ }
1425
+ convolution_param {
1426
+ num_output: 32
1427
+ pad: 2
1428
+ kernel_size: 3
1429
+ group: 32
1430
+ stride: 1
1431
+ weight_filler {
1432
+ type: "msra"
1433
+ }
1434
+ dilation: 2
1435
+ }
1436
+ }
1437
+ layer {
1438
+ name: "stage5_4/conv3"
1439
+ type: "Convolution"
1440
+ bottom: "stage5_4/conv2"
1441
+ top: "stage5_4/conv3"
1442
+ param {
1443
+ lr_mult: 1.0
1444
+ decay_mult: 1.0
1445
+ }
1446
+ convolution_param {
1447
+ num_output: 128
1448
+ pad: 0
1449
+ kernel_size: 1
1450
+ group: 1
1451
+ stride: 1
1452
+ weight_filler {
1453
+ type: "msra"
1454
+ }
1455
+ dilation: 1
1456
+ }
1457
+ }
1458
+ layer {
1459
+ name: "stage5_4/sum"
1460
+ type: "Eltwise"
1461
+ bottom: "stage5_3/sum"
1462
+ bottom: "stage5_4/conv3"
1463
+ top: "stage5_4/sum"
1464
+ eltwise_param {
1465
+ operation: SUM
1466
+ }
1467
+ }
1468
+ layer {
1469
+ name: "stage5_4/relu"
1470
+ type: "ReLU"
1471
+ bottom: "stage5_4/sum"
1472
+ top: "stage5_4/sum"
1473
+ }
1474
+ layer {
1475
+ name: "stage6_1/conv4"
1476
+ type: "Convolution"
1477
+ bottom: "stage5_4/sum"
1478
+ top: "stage6_1/conv4"
1479
+ param {
1480
+ lr_mult: 1.0
1481
+ decay_mult: 1.0
1482
+ }
1483
+ convolution_param {
1484
+ num_output: 128
1485
+ pad: 0
1486
+ kernel_size: 1
1487
+ group: 1
1488
+ stride: 1
1489
+ weight_filler {
1490
+ type: "msra"
1491
+ }
1492
+ dilation: 1
1493
+ }
1494
+ }
1495
+ layer {
1496
+ name: "stage6_1/conv1"
1497
+ type: "Convolution"
1498
+ bottom: "stage5_4/sum"
1499
+ top: "stage6_1/conv1"
1500
+ param {
1501
+ lr_mult: 1.0
1502
+ decay_mult: 1.0
1503
+ }
1504
+ convolution_param {
1505
+ num_output: 32
1506
+ pad: 0
1507
+ kernel_size: 1
1508
+ group: 1
1509
+ stride: 1
1510
+ weight_filler {
1511
+ type: "msra"
1512
+ }
1513
+ dilation: 1
1514
+ }
1515
+ }
1516
+ layer {
1517
+ name: "stage6_1/conv1/relu"
1518
+ type: "ReLU"
1519
+ bottom: "stage6_1/conv1"
1520
+ top: "stage6_1/conv1"
1521
+ }
1522
+ layer {
1523
+ name: "stage6_1/conv2"
1524
+ type: "Convolution"
1525
+ bottom: "stage6_1/conv1"
1526
+ top: "stage6_1/conv2"
1527
+ param {
1528
+ lr_mult: 1.0
1529
+ decay_mult: 1.0
1530
+ }
1531
+ convolution_param {
1532
+ num_output: 32
1533
+ pad: 2
1534
+ kernel_size: 3
1535
+ group: 32
1536
+ stride: 1
1537
+ weight_filler {
1538
+ type: "msra"
1539
+ }
1540
+ dilation: 2
1541
+ }
1542
+ }
1543
+ layer {
1544
+ name: "stage6_1/conv3"
1545
+ type: "Convolution"
1546
+ bottom: "stage6_1/conv2"
1547
+ top: "stage6_1/conv3"
1548
+ param {
1549
+ lr_mult: 1.0
1550
+ decay_mult: 1.0
1551
+ }
1552
+ convolution_param {
1553
+ num_output: 128
1554
+ pad: 0
1555
+ kernel_size: 1
1556
+ group: 1
1557
+ stride: 1
1558
+ weight_filler {
1559
+ type: "msra"
1560
+ }
1561
+ dilation: 1
1562
+ }
1563
+ }
1564
+ layer {
1565
+ name: "stage6_1/sum"
1566
+ type: "Eltwise"
1567
+ bottom: "stage6_1/conv4"
1568
+ bottom: "stage6_1/conv3"
1569
+ top: "stage6_1/sum"
1570
+ eltwise_param {
1571
+ operation: SUM
1572
+ }
1573
+ }
1574
+ layer {
1575
+ name: "stage6_1/relu"
1576
+ type: "ReLU"
1577
+ bottom: "stage6_1/sum"
1578
+ top: "stage6_1/sum"
1579
+ }
1580
+ layer {
1581
+ name: "stage6_2/conv1"
1582
+ type: "Convolution"
1583
+ bottom: "stage6_1/sum"
1584
+ top: "stage6_2/conv1"
1585
+ param {
1586
+ lr_mult: 1.0
1587
+ decay_mult: 1.0
1588
+ }
1589
+ convolution_param {
1590
+ num_output: 32
1591
+ pad: 0
1592
+ kernel_size: 1
1593
+ group: 1
1594
+ stride: 1
1595
+ weight_filler {
1596
+ type: "msra"
1597
+ }
1598
+ dilation: 1
1599
+ }
1600
+ }
1601
+ layer {
1602
+ name: "stage6_2/conv1/relu"
1603
+ type: "ReLU"
1604
+ bottom: "stage6_2/conv1"
1605
+ top: "stage6_2/conv1"
1606
+ }
1607
+ layer {
1608
+ name: "stage6_2/conv2"
1609
+ type: "Convolution"
1610
+ bottom: "stage6_2/conv1"
1611
+ top: "stage6_2/conv2"
1612
+ param {
1613
+ lr_mult: 1.0
1614
+ decay_mult: 1.0
1615
+ }
1616
+ convolution_param {
1617
+ num_output: 32
1618
+ pad: 2
1619
+ kernel_size: 3
1620
+ group: 32
1621
+ stride: 1
1622
+ weight_filler {
1623
+ type: "msra"
1624
+ }
1625
+ dilation: 2
1626
+ }
1627
+ }
1628
+ layer {
1629
+ name: "stage6_2/conv3"
1630
+ type: "Convolution"
1631
+ bottom: "stage6_2/conv2"
1632
+ top: "stage6_2/conv3"
1633
+ param {
1634
+ lr_mult: 1.0
1635
+ decay_mult: 1.0
1636
+ }
1637
+ convolution_param {
1638
+ num_output: 128
1639
+ pad: 0
1640
+ kernel_size: 1
1641
+ group: 1
1642
+ stride: 1
1643
+ weight_filler {
1644
+ type: "msra"
1645
+ }
1646
+ dilation: 1
1647
+ }
1648
+ }
1649
+ layer {
1650
+ name: "stage6_2/sum"
1651
+ type: "Eltwise"
1652
+ bottom: "stage6_1/sum"
1653
+ bottom: "stage6_2/conv3"
1654
+ top: "stage6_2/sum"
1655
+ eltwise_param {
1656
+ operation: SUM
1657
+ }
1658
+ }
1659
+ layer {
1660
+ name: "stage6_2/relu"
1661
+ type: "ReLU"
1662
+ bottom: "stage6_2/sum"
1663
+ top: "stage6_2/sum"
1664
+ }
1665
+ layer {
1666
+ name: "stage7_1/conv4"
1667
+ type: "Convolution"
1668
+ bottom: "stage6_2/sum"
1669
+ top: "stage7_1/conv4"
1670
+ param {
1671
+ lr_mult: 1.0
1672
+ decay_mult: 1.0
1673
+ }
1674
+ convolution_param {
1675
+ num_output: 128
1676
+ pad: 0
1677
+ kernel_size: 1
1678
+ group: 1
1679
+ stride: 1
1680
+ weight_filler {
1681
+ type: "msra"
1682
+ }
1683
+ dilation: 1
1684
+ }
1685
+ }
1686
+ layer {
1687
+ name: "stage7_1/conv1"
1688
+ type: "Convolution"
1689
+ bottom: "stage6_2/sum"
1690
+ top: "stage7_1/conv1"
1691
+ param {
1692
+ lr_mult: 1.0
1693
+ decay_mult: 1.0
1694
+ }
1695
+ convolution_param {
1696
+ num_output: 32
1697
+ pad: 0
1698
+ kernel_size: 1
1699
+ group: 1
1700
+ stride: 1
1701
+ weight_filler {
1702
+ type: "msra"
1703
+ }
1704
+ dilation: 1
1705
+ }
1706
+ }
1707
+ layer {
1708
+ name: "stage7_1/conv1/relu"
1709
+ type: "ReLU"
1710
+ bottom: "stage7_1/conv1"
1711
+ top: "stage7_1/conv1"
1712
+ }
1713
+ layer {
1714
+ name: "stage7_1/conv2"
1715
+ type: "Convolution"
1716
+ bottom: "stage7_1/conv1"
1717
+ top: "stage7_1/conv2"
1718
+ param {
1719
+ lr_mult: 1.0
1720
+ decay_mult: 1.0
1721
+ }
1722
+ convolution_param {
1723
+ num_output: 32
1724
+ pad: 2
1725
+ kernel_size: 3
1726
+ group: 32
1727
+ stride: 1
1728
+ weight_filler {
1729
+ type: "msra"
1730
+ }
1731
+ dilation: 2
1732
+ }
1733
+ }
1734
+ layer {
1735
+ name: "stage7_1/conv3"
1736
+ type: "Convolution"
1737
+ bottom: "stage7_1/conv2"
1738
+ top: "stage7_1/conv3"
1739
+ param {
1740
+ lr_mult: 1.0
1741
+ decay_mult: 1.0
1742
+ }
1743
+ convolution_param {
1744
+ num_output: 128
1745
+ pad: 0
1746
+ kernel_size: 1
1747
+ group: 1
1748
+ stride: 1
1749
+ weight_filler {
1750
+ type: "msra"
1751
+ }
1752
+ dilation: 1
1753
+ }
1754
+ }
1755
+ layer {
1756
+ name: "stage7_1/sum"
1757
+ type: "Eltwise"
1758
+ bottom: "stage7_1/conv4"
1759
+ bottom: "stage7_1/conv3"
1760
+ top: "stage7_1/sum"
1761
+ eltwise_param {
1762
+ operation: SUM
1763
+ }
1764
+ }
1765
+ layer {
1766
+ name: "stage7_1/relu"
1767
+ type: "ReLU"
1768
+ bottom: "stage7_1/sum"
1769
+ top: "stage7_1/sum"
1770
+ }
1771
+ layer {
1772
+ name: "stage7_2/conv1"
1773
+ type: "Convolution"
1774
+ bottom: "stage7_1/sum"
1775
+ top: "stage7_2/conv1"
1776
+ param {
1777
+ lr_mult: 1.0
1778
+ decay_mult: 1.0
1779
+ }
1780
+ convolution_param {
1781
+ num_output: 32
1782
+ pad: 0
1783
+ kernel_size: 1
1784
+ group: 1
1785
+ stride: 1
1786
+ weight_filler {
1787
+ type: "msra"
1788
+ }
1789
+ dilation: 1
1790
+ }
1791
+ }
1792
+ layer {
1793
+ name: "stage7_2/conv1/relu"
1794
+ type: "ReLU"
1795
+ bottom: "stage7_2/conv1"
1796
+ top: "stage7_2/conv1"
1797
+ }
1798
+ layer {
1799
+ name: "stage7_2/conv2"
1800
+ type: "Convolution"
1801
+ bottom: "stage7_2/conv1"
1802
+ top: "stage7_2/conv2"
1803
+ param {
1804
+ lr_mult: 1.0
1805
+ decay_mult: 1.0
1806
+ }
1807
+ convolution_param {
1808
+ num_output: 32
1809
+ pad: 2
1810
+ kernel_size: 3
1811
+ group: 32
1812
+ stride: 1
1813
+ weight_filler {
1814
+ type: "msra"
1815
+ }
1816
+ dilation: 2
1817
+ }
1818
+ }
1819
+ layer {
1820
+ name: "stage7_2/conv3"
1821
+ type: "Convolution"
1822
+ bottom: "stage7_2/conv2"
1823
+ top: "stage7_2/conv3"
1824
+ param {
1825
+ lr_mult: 1.0
1826
+ decay_mult: 1.0
1827
+ }
1828
+ convolution_param {
1829
+ num_output: 128
1830
+ pad: 0
1831
+ kernel_size: 1
1832
+ group: 1
1833
+ stride: 1
1834
+ weight_filler {
1835
+ type: "msra"
1836
+ }
1837
+ dilation: 1
1838
+ }
1839
+ }
1840
+ layer {
1841
+ name: "stage7_2/sum"
1842
+ type: "Eltwise"
1843
+ bottom: "stage7_1/sum"
1844
+ bottom: "stage7_2/conv3"
1845
+ top: "stage7_2/sum"
1846
+ eltwise_param {
1847
+ operation: SUM
1848
+ }
1849
+ }
1850
+ layer {
1851
+ name: "stage7_2/relu"
1852
+ type: "ReLU"
1853
+ bottom: "stage7_2/sum"
1854
+ top: "stage7_2/sum"
1855
+ }
1856
+ layer {
1857
+ name: "stage8_1/conv4"
1858
+ type: "Convolution"
1859
+ bottom: "stage7_2/sum"
1860
+ top: "stage8_1/conv4"
1861
+ param {
1862
+ lr_mult: 1.0
1863
+ decay_mult: 1.0
1864
+ }
1865
+ convolution_param {
1866
+ num_output: 128
1867
+ pad: 0
1868
+ kernel_size: 1
1869
+ group: 1
1870
+ stride: 1
1871
+ weight_filler {
1872
+ type: "msra"
1873
+ }
1874
+ dilation: 1
1875
+ }
1876
+ }
1877
+ layer {
1878
+ name: "stage8_1/conv1"
1879
+ type: "Convolution"
1880
+ bottom: "stage7_2/sum"
1881
+ top: "stage8_1/conv1"
1882
+ param {
1883
+ lr_mult: 1.0
1884
+ decay_mult: 1.0
1885
+ }
1886
+ convolution_param {
1887
+ num_output: 32
1888
+ pad: 0
1889
+ kernel_size: 1
1890
+ group: 1
1891
+ stride: 1
1892
+ weight_filler {
1893
+ type: "msra"
1894
+ }
1895
+ dilation: 1
1896
+ }
1897
+ }
1898
+ layer {
1899
+ name: "stage8_1/conv1/relu"
1900
+ type: "ReLU"
1901
+ bottom: "stage8_1/conv1"
1902
+ top: "stage8_1/conv1"
1903
+ }
1904
+ layer {
1905
+ name: "stage8_1/conv2"
1906
+ type: "Convolution"
1907
+ bottom: "stage8_1/conv1"
1908
+ top: "stage8_1/conv2"
1909
+ param {
1910
+ lr_mult: 1.0
1911
+ decay_mult: 1.0
1912
+ }
1913
+ convolution_param {
1914
+ num_output: 32
1915
+ pad: 2
1916
+ kernel_size: 3
1917
+ group: 32
1918
+ stride: 1
1919
+ weight_filler {
1920
+ type: "msra"
1921
+ }
1922
+ dilation: 2
1923
+ }
1924
+ }
1925
+ layer {
1926
+ name: "stage8_1/conv3"
1927
+ type: "Convolution"
1928
+ bottom: "stage8_1/conv2"
1929
+ top: "stage8_1/conv3"
1930
+ param {
1931
+ lr_mult: 1.0
1932
+ decay_mult: 1.0
1933
+ }
1934
+ convolution_param {
1935
+ num_output: 128
1936
+ pad: 0
1937
+ kernel_size: 1
1938
+ group: 1
1939
+ stride: 1
1940
+ weight_filler {
1941
+ type: "msra"
1942
+ }
1943
+ dilation: 1
1944
+ }
1945
+ }
1946
+ layer {
1947
+ name: "stage8_1/sum"
1948
+ type: "Eltwise"
1949
+ bottom: "stage8_1/conv4"
1950
+ bottom: "stage8_1/conv3"
1951
+ top: "stage8_1/sum"
1952
+ eltwise_param {
1953
+ operation: SUM
1954
+ }
1955
+ }
1956
+ layer {
1957
+ name: "stage8_1/relu"
1958
+ type: "ReLU"
1959
+ bottom: "stage8_1/sum"
1960
+ top: "stage8_1/sum"
1961
+ }
1962
+ layer {
1963
+ name: "stage8_2/conv1"
1964
+ type: "Convolution"
1965
+ bottom: "stage8_1/sum"
1966
+ top: "stage8_2/conv1"
1967
+ param {
1968
+ lr_mult: 1.0
1969
+ decay_mult: 1.0
1970
+ }
1971
+ convolution_param {
1972
+ num_output: 32
1973
+ pad: 0
1974
+ kernel_size: 1
1975
+ group: 1
1976
+ stride: 1
1977
+ weight_filler {
1978
+ type: "msra"
1979
+ }
1980
+ dilation: 1
1981
+ }
1982
+ }
1983
+ layer {
1984
+ name: "stage8_2/conv1/relu"
1985
+ type: "ReLU"
1986
+ bottom: "stage8_2/conv1"
1987
+ top: "stage8_2/conv1"
1988
+ }
1989
+ layer {
1990
+ name: "stage8_2/conv2"
1991
+ type: "Convolution"
1992
+ bottom: "stage8_2/conv1"
1993
+ top: "stage8_2/conv2"
1994
+ param {
1995
+ lr_mult: 1.0
1996
+ decay_mult: 1.0
1997
+ }
1998
+ convolution_param {
1999
+ num_output: 32
2000
+ pad: 2
2001
+ kernel_size: 3
2002
+ group: 32
2003
+ stride: 1
2004
+ weight_filler {
2005
+ type: "msra"
2006
+ }
2007
+ dilation: 2
2008
+ }
2009
+ }
2010
+ layer {
2011
+ name: "stage8_2/conv3"
2012
+ type: "Convolution"
2013
+ bottom: "stage8_2/conv2"
2014
+ top: "stage8_2/conv3"
2015
+ param {
2016
+ lr_mult: 1.0
2017
+ decay_mult: 1.0
2018
+ }
2019
+ convolution_param {
2020
+ num_output: 128
2021
+ pad: 0
2022
+ kernel_size: 1
2023
+ group: 1
2024
+ stride: 1
2025
+ weight_filler {
2026
+ type: "msra"
2027
+ }
2028
+ dilation: 1
2029
+ }
2030
+ }
2031
+ layer {
2032
+ name: "stage8_2/sum"
2033
+ type: "Eltwise"
2034
+ bottom: "stage8_1/sum"
2035
+ bottom: "stage8_2/conv3"
2036
+ top: "stage8_2/sum"
2037
+ eltwise_param {
2038
+ operation: SUM
2039
+ }
2040
+ }
2041
+ layer {
2042
+ name: "stage8_2/relu"
2043
+ type: "ReLU"
2044
+ bottom: "stage8_2/sum"
2045
+ top: "stage8_2/sum"
2046
+ }
2047
+ layer {
2048
+ name: "cls1/conv"
2049
+ type: "Convolution"
2050
+ bottom: "stage4_8/sum"
2051
+ top: "cls1/conv"
2052
+ param {
2053
+ lr_mult: 1.0
2054
+ decay_mult: 1.0
2055
+ }
2056
+ param {
2057
+ lr_mult: 1.0
2058
+ decay_mult: 0.0
2059
+ }
2060
+ convolution_param {
2061
+ num_output: 12
2062
+ bias_term: true
2063
+ pad: 0
2064
+ kernel_size: 1
2065
+ group: 1
2066
+ stride: 1
2067
+ weight_filler {
2068
+ type: "msra"
2069
+ }
2070
+ dilation: 1
2071
+ }
2072
+ }
2073
+ layer {
2074
+ name: "cls1/permute"
2075
+ type: "Permute"
2076
+ bottom: "cls1/conv"
2077
+ top: "cls1/permute"
2078
+ permute_param {
2079
+ order: 0
2080
+ order: 2
2081
+ order: 3
2082
+ order: 1
2083
+ }
2084
+ }
2085
+ layer {
2086
+ name: "cls1/flatten"
2087
+ type: "Flatten"
2088
+ bottom: "cls1/permute"
2089
+ top: "cls1/flatten"
2090
+ flatten_param {
2091
+ axis: 1
2092
+ }
2093
+ }
2094
+ layer {
2095
+ name: "loc1/conv"
2096
+ type: "Convolution"
2097
+ bottom: "stage4_8/sum"
2098
+ top: "loc1/conv"
2099
+ param {
2100
+ lr_mult: 1.0
2101
+ decay_mult: 1.0
2102
+ }
2103
+ param {
2104
+ lr_mult: 1.0
2105
+ decay_mult: 0.0
2106
+ }
2107
+ convolution_param {
2108
+ num_output: 24
2109
+ bias_term: true
2110
+ pad: 0
2111
+ kernel_size: 1
2112
+ group: 1
2113
+ stride: 1
2114
+ weight_filler {
2115
+ type: "msra"
2116
+ }
2117
+ dilation: 1
2118
+ }
2119
+ }
2120
+ layer {
2121
+ name: "loc1/permute"
2122
+ type: "Permute"
2123
+ bottom: "loc1/conv"
2124
+ top: "loc1/permute"
2125
+ permute_param {
2126
+ order: 0
2127
+ order: 2
2128
+ order: 3
2129
+ order: 1
2130
+ }
2131
+ }
2132
+ layer {
2133
+ name: "loc1/flatten"
2134
+ type: "Flatten"
2135
+ bottom: "loc1/permute"
2136
+ top: "loc1/flatten"
2137
+ flatten_param {
2138
+ axis: 1
2139
+ }
2140
+ }
2141
+ layer {
2142
+ name: "stage4_8/sum/prior_box"
2143
+ type: "PriorBox"
2144
+ bottom: "stage4_8/sum"
2145
+ bottom: "data"
2146
+ top: "stage4_8/sum/prior_box"
2147
+ prior_box_param {
2148
+ min_size: 50.0
2149
+ max_size: 100.0
2150
+ aspect_ratio: 2.0
2151
+ aspect_ratio: 0.5
2152
+ aspect_ratio: 3.0
2153
+ aspect_ratio: 0.3333333432674408
2154
+ flip: false
2155
+ clip: false
2156
+ variance: 0.10000000149011612
2157
+ variance: 0.10000000149011612
2158
+ variance: 0.20000000298023224
2159
+ variance: 0.20000000298023224
2160
+ step: 16.0
2161
+ }
2162
+ }
2163
+ layer {
2164
+ name: "cls2/conv"
2165
+ type: "Convolution"
2166
+ bottom: "stage5_4/sum"
2167
+ top: "cls2/conv"
2168
+ param {
2169
+ lr_mult: 1.0
2170
+ decay_mult: 1.0
2171
+ }
2172
+ param {
2173
+ lr_mult: 1.0
2174
+ decay_mult: 0.0
2175
+ }
2176
+ convolution_param {
2177
+ num_output: 12
2178
+ bias_term: true
2179
+ pad: 0
2180
+ kernel_size: 1
2181
+ group: 1
2182
+ stride: 1
2183
+ weight_filler {
2184
+ type: "msra"
2185
+ }
2186
+ dilation: 1
2187
+ }
2188
+ }
2189
+ layer {
2190
+ name: "cls2/permute"
2191
+ type: "Permute"
2192
+ bottom: "cls2/conv"
2193
+ top: "cls2/permute"
2194
+ permute_param {
2195
+ order: 0
2196
+ order: 2
2197
+ order: 3
2198
+ order: 1
2199
+ }
2200
+ }
2201
+ layer {
2202
+ name: "cls2/flatten"
2203
+ type: "Flatten"
2204
+ bottom: "cls2/permute"
2205
+ top: "cls2/flatten"
2206
+ flatten_param {
2207
+ axis: 1
2208
+ }
2209
+ }
2210
+ layer {
2211
+ name: "loc2/conv"
2212
+ type: "Convolution"
2213
+ bottom: "stage5_4/sum"
2214
+ top: "loc2/conv"
2215
+ param {
2216
+ lr_mult: 1.0
2217
+ decay_mult: 1.0
2218
+ }
2219
+ param {
2220
+ lr_mult: 1.0
2221
+ decay_mult: 0.0
2222
+ }
2223
+ convolution_param {
2224
+ num_output: 24
2225
+ bias_term: true
2226
+ pad: 0
2227
+ kernel_size: 1
2228
+ group: 1
2229
+ stride: 1
2230
+ weight_filler {
2231
+ type: "msra"
2232
+ }
2233
+ dilation: 1
2234
+ }
2235
+ }
2236
+ layer {
2237
+ name: "loc2/permute"
2238
+ type: "Permute"
2239
+ bottom: "loc2/conv"
2240
+ top: "loc2/permute"
2241
+ permute_param {
2242
+ order: 0
2243
+ order: 2
2244
+ order: 3
2245
+ order: 1
2246
+ }
2247
+ }
2248
+ layer {
2249
+ name: "loc2/flatten"
2250
+ type: "Flatten"
2251
+ bottom: "loc2/permute"
2252
+ top: "loc2/flatten"
2253
+ flatten_param {
2254
+ axis: 1
2255
+ }
2256
+ }
2257
+ layer {
2258
+ name: "stage5_4/sum/prior_box"
2259
+ type: "PriorBox"
2260
+ bottom: "stage5_4/sum"
2261
+ bottom: "data"
2262
+ top: "stage5_4/sum/prior_box"
2263
+ prior_box_param {
2264
+ min_size: 100.0
2265
+ max_size: 150.0
2266
+ aspect_ratio: 2.0
2267
+ aspect_ratio: 0.5
2268
+ aspect_ratio: 3.0
2269
+ aspect_ratio: 0.3333333432674408
2270
+ flip: false
2271
+ clip: false
2272
+ variance: 0.10000000149011612
2273
+ variance: 0.10000000149011612
2274
+ variance: 0.20000000298023224
2275
+ variance: 0.20000000298023224
2276
+ step: 32.0
2277
+ }
2278
+ }
2279
+ layer {
2280
+ name: "cls3/conv"
2281
+ type: "Convolution"
2282
+ bottom: "stage6_2/sum"
2283
+ top: "cls3/conv"
2284
+ param {
2285
+ lr_mult: 1.0
2286
+ decay_mult: 1.0
2287
+ }
2288
+ param {
2289
+ lr_mult: 1.0
2290
+ decay_mult: 0.0
2291
+ }
2292
+ convolution_param {
2293
+ num_output: 12
2294
+ bias_term: true
2295
+ pad: 0
2296
+ kernel_size: 1
2297
+ group: 1
2298
+ stride: 1
2299
+ weight_filler {
2300
+ type: "msra"
2301
+ }
2302
+ dilation: 1
2303
+ }
2304
+ }
2305
+ layer {
2306
+ name: "cls3/permute"
2307
+ type: "Permute"
2308
+ bottom: "cls3/conv"
2309
+ top: "cls3/permute"
2310
+ permute_param {
2311
+ order: 0
2312
+ order: 2
2313
+ order: 3
2314
+ order: 1
2315
+ }
2316
+ }
2317
+ layer {
2318
+ name: "cls3/flatten"
2319
+ type: "Flatten"
2320
+ bottom: "cls3/permute"
2321
+ top: "cls3/flatten"
2322
+ flatten_param {
2323
+ axis: 1
2324
+ }
2325
+ }
2326
+ layer {
2327
+ name: "loc3/conv"
2328
+ type: "Convolution"
2329
+ bottom: "stage6_2/sum"
2330
+ top: "loc3/conv"
2331
+ param {
2332
+ lr_mult: 1.0
2333
+ decay_mult: 1.0
2334
+ }
2335
+ param {
2336
+ lr_mult: 1.0
2337
+ decay_mult: 0.0
2338
+ }
2339
+ convolution_param {
2340
+ num_output: 24
2341
+ bias_term: true
2342
+ pad: 0
2343
+ kernel_size: 1
2344
+ group: 1
2345
+ stride: 1
2346
+ weight_filler {
2347
+ type: "msra"
2348
+ }
2349
+ dilation: 1
2350
+ }
2351
+ }
2352
+ layer {
2353
+ name: "loc3/permute"
2354
+ type: "Permute"
2355
+ bottom: "loc3/conv"
2356
+ top: "loc3/permute"
2357
+ permute_param {
2358
+ order: 0
2359
+ order: 2
2360
+ order: 3
2361
+ order: 1
2362
+ }
2363
+ }
2364
+ layer {
2365
+ name: "loc3/flatten"
2366
+ type: "Flatten"
2367
+ bottom: "loc3/permute"
2368
+ top: "loc3/flatten"
2369
+ flatten_param {
2370
+ axis: 1
2371
+ }
2372
+ }
2373
+ layer {
2374
+ name: "stage6_2/sum/prior_box"
2375
+ type: "PriorBox"
2376
+ bottom: "stage6_2/sum"
2377
+ bottom: "data"
2378
+ top: "stage6_2/sum/prior_box"
2379
+ prior_box_param {
2380
+ min_size: 150.0
2381
+ max_size: 200.0
2382
+ aspect_ratio: 2.0
2383
+ aspect_ratio: 0.5
2384
+ aspect_ratio: 3.0
2385
+ aspect_ratio: 0.3333333432674408
2386
+ flip: false
2387
+ clip: false
2388
+ variance: 0.10000000149011612
2389
+ variance: 0.10000000149011612
2390
+ variance: 0.20000000298023224
2391
+ variance: 0.20000000298023224
2392
+ step: 32.0
2393
+ }
2394
+ }
2395
+ layer {
2396
+ name: "cls4/conv"
2397
+ type: "Convolution"
2398
+ bottom: "stage7_2/sum"
2399
+ top: "cls4/conv"
2400
+ param {
2401
+ lr_mult: 1.0
2402
+ decay_mult: 1.0
2403
+ }
2404
+ param {
2405
+ lr_mult: 1.0
2406
+ decay_mult: 0.0
2407
+ }
2408
+ convolution_param {
2409
+ num_output: 12
2410
+ bias_term: true
2411
+ pad: 0
2412
+ kernel_size: 1
2413
+ group: 1
2414
+ stride: 1
2415
+ weight_filler {
2416
+ type: "msra"
2417
+ }
2418
+ dilation: 1
2419
+ }
2420
+ }
2421
+ layer {
2422
+ name: "cls4/permute"
2423
+ type: "Permute"
2424
+ bottom: "cls4/conv"
2425
+ top: "cls4/permute"
2426
+ permute_param {
2427
+ order: 0
2428
+ order: 2
2429
+ order: 3
2430
+ order: 1
2431
+ }
2432
+ }
2433
+ layer {
2434
+ name: "cls4/flatten"
2435
+ type: "Flatten"
2436
+ bottom: "cls4/permute"
2437
+ top: "cls4/flatten"
2438
+ flatten_param {
2439
+ axis: 1
2440
+ }
2441
+ }
2442
+ layer {
2443
+ name: "loc4/conv"
2444
+ type: "Convolution"
2445
+ bottom: "stage7_2/sum"
2446
+ top: "loc4/conv"
2447
+ param {
2448
+ lr_mult: 1.0
2449
+ decay_mult: 1.0
2450
+ }
2451
+ param {
2452
+ lr_mult: 1.0
2453
+ decay_mult: 0.0
2454
+ }
2455
+ convolution_param {
2456
+ num_output: 24
2457
+ bias_term: true
2458
+ pad: 0
2459
+ kernel_size: 1
2460
+ group: 1
2461
+ stride: 1
2462
+ weight_filler {
2463
+ type: "msra"
2464
+ }
2465
+ dilation: 1
2466
+ }
2467
+ }
2468
+ layer {
2469
+ name: "loc4/permute"
2470
+ type: "Permute"
2471
+ bottom: "loc4/conv"
2472
+ top: "loc4/permute"
2473
+ permute_param {
2474
+ order: 0
2475
+ order: 2
2476
+ order: 3
2477
+ order: 1
2478
+ }
2479
+ }
2480
+ layer {
2481
+ name: "loc4/flatten"
2482
+ type: "Flatten"
2483
+ bottom: "loc4/permute"
2484
+ top: "loc4/flatten"
2485
+ flatten_param {
2486
+ axis: 1
2487
+ }
2488
+ }
2489
+ layer {
2490
+ name: "stage7_2/sum/prior_box"
2491
+ type: "PriorBox"
2492
+ bottom: "stage7_2/sum"
2493
+ bottom: "data"
2494
+ top: "stage7_2/sum/prior_box"
2495
+ prior_box_param {
2496
+ min_size: 200.0
2497
+ max_size: 300.0
2498
+ aspect_ratio: 2.0
2499
+ aspect_ratio: 0.5
2500
+ aspect_ratio: 3.0
2501
+ aspect_ratio: 0.3333333432674408
2502
+ flip: false
2503
+ clip: false
2504
+ variance: 0.10000000149011612
2505
+ variance: 0.10000000149011612
2506
+ variance: 0.20000000298023224
2507
+ variance: 0.20000000298023224
2508
+ step: 32.0
2509
+ }
2510
+ }
2511
+ layer {
2512
+ name: "cls5/conv"
2513
+ type: "Convolution"
2514
+ bottom: "stage8_2/sum"
2515
+ top: "cls5/conv"
2516
+ param {
2517
+ lr_mult: 1.0
2518
+ decay_mult: 1.0
2519
+ }
2520
+ param {
2521
+ lr_mult: 1.0
2522
+ decay_mult: 0.0
2523
+ }
2524
+ convolution_param {
2525
+ num_output: 12
2526
+ bias_term: true
2527
+ pad: 0
2528
+ kernel_size: 1
2529
+ group: 1
2530
+ stride: 1
2531
+ weight_filler {
2532
+ type: "msra"
2533
+ }
2534
+ dilation: 1
2535
+ }
2536
+ }
2537
+ layer {
2538
+ name: "cls5/permute"
2539
+ type: "Permute"
2540
+ bottom: "cls5/conv"
2541
+ top: "cls5/permute"
2542
+ permute_param {
2543
+ order: 0
2544
+ order: 2
2545
+ order: 3
2546
+ order: 1
2547
+ }
2548
+ }
2549
+ layer {
2550
+ name: "cls5/flatten"
2551
+ type: "Flatten"
2552
+ bottom: "cls5/permute"
2553
+ top: "cls5/flatten"
2554
+ flatten_param {
2555
+ axis: 1
2556
+ }
2557
+ }
2558
+ layer {
2559
+ name: "loc5/conv"
2560
+ type: "Convolution"
2561
+ bottom: "stage8_2/sum"
2562
+ top: "loc5/conv"
2563
+ param {
2564
+ lr_mult: 1.0
2565
+ decay_mult: 1.0
2566
+ }
2567
+ param {
2568
+ lr_mult: 1.0
2569
+ decay_mult: 0.0
2570
+ }
2571
+ convolution_param {
2572
+ num_output: 24
2573
+ bias_term: true
2574
+ pad: 0
2575
+ kernel_size: 1
2576
+ group: 1
2577
+ stride: 1
2578
+ weight_filler {
2579
+ type: "msra"
2580
+ }
2581
+ dilation: 1
2582
+ }
2583
+ }
2584
+ layer {
2585
+ name: "loc5/permute"
2586
+ type: "Permute"
2587
+ bottom: "loc5/conv"
2588
+ top: "loc5/permute"
2589
+ permute_param {
2590
+ order: 0
2591
+ order: 2
2592
+ order: 3
2593
+ order: 1
2594
+ }
2595
+ }
2596
+ layer {
2597
+ name: "loc5/flatten"
2598
+ type: "Flatten"
2599
+ bottom: "loc5/permute"
2600
+ top: "loc5/flatten"
2601
+ flatten_param {
2602
+ axis: 1
2603
+ }
2604
+ }
2605
+ layer {
2606
+ name: "stage8_2/sum/prior_box"
2607
+ type: "PriorBox"
2608
+ bottom: "stage8_2/sum"
2609
+ bottom: "data"
2610
+ top: "stage8_2/sum/prior_box"
2611
+ prior_box_param {
2612
+ min_size: 300.0
2613
+ max_size: 400.0
2614
+ aspect_ratio: 2.0
2615
+ aspect_ratio: 0.5
2616
+ aspect_ratio: 3.0
2617
+ aspect_ratio: 0.3333333432674408
2618
+ flip: false
2619
+ clip: false
2620
+ variance: 0.10000000149011612
2621
+ variance: 0.10000000149011612
2622
+ variance: 0.20000000298023224
2623
+ variance: 0.20000000298023224
2624
+ step: 32.0
2625
+ }
2626
+ }
2627
+ layer {
2628
+ name: "mbox_conf"
2629
+ type: "Concat"
2630
+ bottom: "cls1/flatten"
2631
+ bottom: "cls2/flatten"
2632
+ bottom: "cls3/flatten"
2633
+ bottom: "cls4/flatten"
2634
+ bottom: "cls5/flatten"
2635
+ top: "mbox_conf"
2636
+ concat_param {
2637
+ axis: 1
2638
+ }
2639
+ }
2640
+ layer {
2641
+ name: "mbox_loc"
2642
+ type: "Concat"
2643
+ bottom: "loc1/flatten"
2644
+ bottom: "loc2/flatten"
2645
+ bottom: "loc3/flatten"
2646
+ bottom: "loc4/flatten"
2647
+ bottom: "loc5/flatten"
2648
+ top: "mbox_loc"
2649
+ concat_param {
2650
+ axis: 1
2651
+ }
2652
+ }
2653
+ layer {
2654
+ name: "mbox_priorbox"
2655
+ type: "Concat"
2656
+ bottom: "stage4_8/sum/prior_box"
2657
+ bottom: "stage5_4/sum/prior_box"
2658
+ bottom: "stage6_2/sum/prior_box"
2659
+ bottom: "stage7_2/sum/prior_box"
2660
+ bottom: "stage8_2/sum/prior_box"
2661
+ top: "mbox_priorbox"
2662
+ concat_param {
2663
+ axis: 2
2664
+ }
2665
+ }
2666
+ layer {
2667
+ name: "mbox_conf_reshape"
2668
+ type: "Reshape"
2669
+ bottom: "mbox_conf"
2670
+ top: "mbox_conf_reshape"
2671
+ reshape_param {
2672
+ shape {
2673
+ dim: 0
2674
+ dim: -1
2675
+ dim: 2
2676
+ }
2677
+ }
2678
+ }
2679
+ layer {
2680
+ name: "mbox_conf_softmax"
2681
+ type: "Softmax"
2682
+ bottom: "mbox_conf_reshape"
2683
+ top: "mbox_conf_softmax"
2684
+ softmax_param {
2685
+ axis: 2
2686
+ }
2687
+ }
2688
+ layer {
2689
+ name: "mbox_conf_flatten"
2690
+ type: "Flatten"
2691
+ bottom: "mbox_conf_softmax"
2692
+ top: "mbox_conf_flatten"
2693
+ flatten_param {
2694
+ axis: 1
2695
+ }
2696
+ }
2697
+ layer {
2698
+ name: "detection_output"
2699
+ type: "DetectionOutput"
2700
+ bottom: "mbox_loc"
2701
+ bottom: "mbox_conf_flatten"
2702
+ bottom: "mbox_priorbox"
2703
+ top: "detection_output"
2704
+ detection_output_param {
2705
+ num_classes: 2
2706
+ share_location: true
2707
+ background_label_id: 0
2708
+ nms_param {
2709
+ nms_threshold: 0.44999998807907104
2710
+ top_k: 100
2711
+ }
2712
+ code_type: CENTER_SIZE
2713
+ keep_top_k: 100
2714
+ confidence_threshold: 0.20000000298023224
2715
+ }
2716
+ }
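
The listing above closes detect.prototxt with a standard SSD-style head: each scale's cls*/loc* 1x1 convolutions are permuted, flattened and concatenated into mbox_conf and mbox_loc, the five PriorBox outputs are concatenated into mbox_priorbox, and DetectionOutput decodes CENTER_SIZE offsets for 2 classes with NMS at 0.45 and a confidence threshold of 0.2. A minimal inference sketch with OpenCV's DNN module follows; the file names match this commit, but the preprocessing (grayscale input, 384x384 blob, 1/255 scaling) is an assumption and must agree with the data layer defined earlier in the file.

import cv2
import numpy as np

# Sketch: run the SSD-style detector defined by detect.prototxt with OpenCV DNN.
net = cv2.dnn.readNetFromCaffe("detect.prototxt", "detect.caffemodel")

img = cv2.imread("input.jpg")                      # hypothetical test image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)       # assumed single-channel input
blob = cv2.dnn.blobFromImage(gray, scalefactor=1.0 / 255.0, size=(384, 384))
net.setInput(blob)

# DetectionOutput produces a 1x1xNx7 blob:
# [image_id, class_id, confidence, x_min, y_min, x_max, y_max],
# with box coordinates normalized to [0, 1].
out = net.forward("detection_output").reshape(-1, 7)
h, w = img.shape[:2]
for _, cls, conf, x1, y1, x2, y2 in out:
    if conf < 0.2:  # same confidence_threshold as the prototxt
        continue
    box = (int(x1 * w), int(y1 * h), int(x2 * w), int(y2 * h))
    print(f"class={int(cls)} conf={conf:.2f} box={box}")
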
sr.caffemodel ADDED
Binary file (23.9 kB). View file
 
sr.prototxt ADDED
@@ -0,0 +1,403 @@
+ layer {
+ name: "data"
+ type: "Input"
+ top: "data"
+ input_param {
+ shape {
+ dim: 1
+ dim: 1
+ dim: 224
+ dim: 224
+ }
+ }
+ }
+ layer {
+ name: "conv0"
+ type: "Convolution"
+ bottom: "data"
+ top: "conv0"
+ param {
+ lr_mult: 1.0
+ decay_mult: 1.0
+ }
+ param {
+ lr_mult: 1.0
+ decay_mult: 0.0
+ }
+ convolution_param {
+ num_output: 32
+ bias_term: true
+ pad: 1
+ kernel_size: 3
+ group: 1
+ stride: 1
+ weight_filler {
+ type: "msra"
+ }
+ }
+ }
+ layer {
+ name: "conv0/lrelu"
+ type: "ReLU"
+ bottom: "conv0"
+ top: "conv0"
+ relu_param {
+ negative_slope: 0.05000000074505806
+ }
+ }
+ layer {
+ name: "db1/reduce"
+ type: "Convolution"
+ bottom: "conv0"
+ top: "db1/reduce"
+ param {
+ lr_mult: 1.0
+ decay_mult: 1.0
+ }
+ param {
+ lr_mult: 1.0
+ decay_mult: 0.0
+ }
+ convolution_param {
+ num_output: 8
+ bias_term: true
+ pad: 0
+ kernel_size: 1
+ group: 1
+ stride: 1
+ weight_filler {
+ type: "msra"
+ }
+ }
+ }
+ layer {
+ name: "db1/reduce/lrelu"
+ type: "ReLU"
+ bottom: "db1/reduce"
+ top: "db1/reduce"
+ relu_param {
+ negative_slope: 0.05000000074505806
+ }
+ }
+ layer {
+ name: "db1/3x3"
+ type: "Convolution"
+ bottom: "db1/reduce"
+ top: "db1/3x3"
+ param {
+ lr_mult: 1.0
+ decay_mult: 1.0
+ }
+ param {
+ lr_mult: 1.0
+ decay_mult: 0.0
+ }
+ convolution_param {
+ num_output: 8
+ bias_term: true
+ pad: 1
+ kernel_size: 3
+ group: 8
+ stride: 1
+ weight_filler {
+ type: "msra"
+ }
+ }
+ }
+ layer {
+ name: "db1/3x3/lrelu"
+ type: "ReLU"
+ bottom: "db1/3x3"
+ top: "db1/3x3"
+ relu_param {
+ negative_slope: 0.05000000074505806
+ }
+ }
+ layer {
+ name: "db1/1x1"
+ type: "Convolution"
+ bottom: "db1/3x3"
+ top: "db1/1x1"
+ param {
+ lr_mult: 1.0
+ decay_mult: 1.0
+ }
+ param {
+ lr_mult: 1.0
+ decay_mult: 0.0
+ }
+ convolution_param {
+ num_output: 32
+ bias_term: true
+ pad: 0
+ kernel_size: 1
+ group: 1
+ stride: 1
+ weight_filler {
+ type: "msra"
+ }
+ }
+ }
+ layer {
+ name: "db1/1x1/lrelu"
+ type: "ReLU"
+ bottom: "db1/1x1"
+ top: "db1/1x1"
+ relu_param {
+ negative_slope: 0.05000000074505806
+ }
+ }
+ layer {
+ name: "db1/concat"
+ type: "Concat"
+ bottom: "conv0"
+ bottom: "db1/1x1"
+ top: "db1/concat"
+ concat_param {
+ axis: 1
+ }
+ }
+ layer {
+ name: "db2/reduce"
+ type: "Convolution"
+ bottom: "db1/concat"
+ top: "db2/reduce"
+ param {
+ lr_mult: 1.0
+ decay_mult: 1.0
+ }
+ param {
+ lr_mult: 1.0
+ decay_mult: 0.0
+ }
+ convolution_param {
+ num_output: 8
+ bias_term: true
+ pad: 0
+ kernel_size: 1
+ group: 1
+ stride: 1
+ weight_filler {
+ type: "msra"
+ }
+ }
+ }
+ layer {
+ name: "db2/reduce/lrelu"
+ type: "ReLU"
+ bottom: "db2/reduce"
+ top: "db2/reduce"
+ relu_param {
+ negative_slope: 0.05000000074505806
+ }
+ }
+ layer {
+ name: "db2/3x3"
+ type: "Convolution"
+ bottom: "db2/reduce"
+ top: "db2/3x3"
+ param {
+ lr_mult: 1.0
+ decay_mult: 1.0
+ }
+ param {
+ lr_mult: 1.0
+ decay_mult: 0.0
+ }
+ convolution_param {
+ num_output: 8
+ bias_term: true
+ pad: 1
+ kernel_size: 3
+ group: 8
+ stride: 1
+ weight_filler {
+ type: "msra"
+ }
+ }
+ }
+ layer {
+ name: "db2/3x3/lrelu"
+ type: "ReLU"
+ bottom: "db2/3x3"
+ top: "db2/3x3"
+ relu_param {
+ negative_slope: 0.05000000074505806
+ }
+ }
+ layer {
+ name: "db2/1x1"
+ type: "Convolution"
+ bottom: "db2/3x3"
+ top: "db2/1x1"
+ param {
+ lr_mult: 1.0
+ decay_mult: 1.0
+ }
+ param {
+ lr_mult: 1.0
+ decay_mult: 0.0
+ }
+ convolution_param {
+ num_output: 32
+ bias_term: true
+ pad: 0
+ kernel_size: 1
+ group: 1
+ stride: 1
+ weight_filler {
+ type: "msra"
+ }
+ }
+ }
+ layer {
+ name: "db2/1x1/lrelu"
+ type: "ReLU"
+ bottom: "db2/1x1"
+ top: "db2/1x1"
+ relu_param {
+ negative_slope: 0.05000000074505806
+ }
+ }
+ layer {
+ name: "db2/concat"
+ type: "Concat"
+ bottom: "db1/concat"
+ bottom: "db2/1x1"
+ top: "db2/concat"
+ concat_param {
+ axis: 1
+ }
+ }
+ layer {
+ name: "upsample/reduce"
+ type: "Convolution"
+ bottom: "db2/concat"
+ top: "upsample/reduce"
+ param {
+ lr_mult: 1.0
+ decay_mult: 1.0
+ }
+ param {
+ lr_mult: 1.0
+ decay_mult: 0.0
+ }
+ convolution_param {
+ num_output: 32
+ bias_term: true
+ pad: 0
+ kernel_size: 1
+ group: 1
+ stride: 1
+ weight_filler {
+ type: "msra"
+ }
+ }
+ }
+ layer {
+ name: "upsample/reduce/lrelu"
+ type: "ReLU"
+ bottom: "upsample/reduce"
+ top: "upsample/reduce"
+ relu_param {
+ negative_slope: 0.05000000074505806
+ }
+ }
+ layer {
+ name: "upsample/deconv"
+ type: "Deconvolution"
+ bottom: "upsample/reduce"
+ top: "upsample/deconv"
+ param {
+ lr_mult: 1.0
+ decay_mult: 1.0
+ }
+ param {
+ lr_mult: 1.0
+ decay_mult: 0.0
+ }
+ convolution_param {
+ num_output: 32
+ bias_term: true
+ pad: 1
+ kernel_size: 3
+ group: 32
+ stride: 2
+ weight_filler {
+ type: "msra"
+ }
+ }
+ }
+ layer {
+ name: "upsample/lrelu"
+ type: "ReLU"
+ bottom: "upsample/deconv"
+ top: "upsample/deconv"
+ relu_param {
+ negative_slope: 0.05000000074505806
+ }
+ }
+ layer {
+ name: "upsample/rec"
+ type: "Convolution"
+ bottom: "upsample/deconv"
+ top: "upsample/rec"
+ param {
+ lr_mult: 1.0
+ decay_mult: 1.0
+ }
+ param {
+ lr_mult: 1.0
+ decay_mult: 0.0
+ }
+ convolution_param {
+ num_output: 1
+ bias_term: true
+ pad: 0
+ kernel_size: 1
+ group: 1
+ stride: 1
+ weight_filler {
+ type: "msra"
+ }
+ }
+ }
+ layer {
+ name: "nearest"
+ type: "Deconvolution"
+ bottom: "data"
+ top: "nearest"
+ param {
+ lr_mult: 0.0
+ decay_mult: 0.0
+ }
+ convolution_param {
+ num_output: 1
+ bias_term: false
+ pad: 0
+ kernel_size: 2
+ group: 1
+ stride: 2
+ weight_filler {
+ type: "constant"
+ value: 1.0
+ }
+ }
+ }
+ layer {
+ name: "Crop1"
+ type: "Crop"
+ bottom: "nearest"
+ bottom: "upsample/rec"
+ top: "Crop1"
+ }
+ layer {
+ name: "fc"
+ type: "Eltwise"
+ bottom: "Crop1"
+ bottom: "upsample/rec"
+ top: "fc"
+ eltwise_param {
+ operation: SUM
+ }
+ }
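
sr.prototxt defines a small 2x super-resolution network for single-channel input: a 3x3 conv0 stem, two dense blocks (db1, db2) built from 1x1 reduce, depthwise 3x3 and 1x1 expand convolutions with leaky ReLUs, a stride-2 Deconvolution that upsamples a reduced feature map, and a 1x1 reconstruction layer (upsample/rec). The "nearest" layer is a fixed, constant-weight 2x2 stride-2 deconvolution (lr_mult: 0) that nearest-neighbour upsamples the raw input; it is cropped to the size of upsample/rec and added elementwise, so the final top "fc" is a 2x-enlarged image plus a learned residual. A minimal inference sketch, again with OpenCV's DNN module; treating the input as a [0, 1]-scaled grayscale image is an assumption, since no normalization is stated in this repo.

import cv2
import numpy as np

# Sketch: run the 2x super-resolution net defined by sr.prototxt with OpenCV DNN.
net = cv2.dnn.readNetFromCaffe("sr.prototxt", "sr.caffemodel")

gray = cv2.imread("patch.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input image
gray = cv2.resize(gray, (224, 224))                   # match the declared 1x1x224x224 data shape
blob = cv2.dnn.blobFromImage(gray, scalefactor=1.0 / 255.0)
net.setInput(blob)

# "fc" = cropped nearest-neighbour 2x upsample of the input + learned residual,
# i.e. a single-channel image at roughly twice the input resolution,
# in the same value range as the (scaled) input.
sr = net.forward("fc")[0, 0]
sr = np.clip(sr * 255.0, 0, 255).astype(np.uint8)
cv2.imwrite("patch_x2.png", sr)
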