arXiv:1603.05027

Identity Mappings in Deep Residual Networks

Published on Mar 16, 2016
Authors: Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun
Abstract

Deep residual networks have emerged as a family of extremely deep architectures showing compelling accuracy and nice convergence behaviors. In this paper, we analyze the propagation formulations behind the residual building blocks, which suggest that the forward and backward signals can be directly propagated from one block to any other block, when using identity mappings as the skip connections and after-addition activation. A series of ablation experiments support the importance of these identity mappings. This motivates us to propose a new residual unit, which makes training easier and improves generalization. We report improved results using a 1001-layer ResNet on CIFAR-10 (4.62% error) and CIFAR-100, and a 200-layer ResNet on ImageNet. Code is available at: https://github.com/KaimingHe/resnet-1k-layers
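The "new residual unit" the abstract refers to is the pre-activation design: batch normalization and ReLU are moved before each convolution, the skip connection is a pure identity, and there is no activation after the addition. Below is a minimal illustrative sketch in PyTorch, not the authors' released code (the linked repository is written in Torch); the class name PreActResidualUnit and the fixed 3x3 convolution layout are my own choices for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PreActResidualUnit(nn.Module):
    """Sketch of a 'pre-activation' residual unit.

    With an identity skip connection and no post-addition activation,
    x_{l+1} = x_l + F(x_l), so unrolling gives
    x_L = x_l + sum_{i=l}^{L-1} F(x_i): forward and backward signals
    pass directly between any two units.
    """

    def __init__(self, channels):
        super().__init__()
        # BN and ReLU act as "pre-activation" of each weight layer.
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)

    def forward(self, x):
        out = self.conv1(F.relu(self.bn1(x)))
        out = self.conv2(F.relu(self.bn2(out)))
        return x + out  # identity shortcut; no activation after the addition

# Example: a residual unit applied to a 64-channel feature map.
y = PreActResidualUnit(64)(torch.randn(1, 64, 32, 32))
```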
