Li Shen, Zhouchen Lin, Qingming Huang
Learning deeper convolutional neural networks has become a trend in recent years. However, much empirical evidence suggests that performance improvement cannot be gained by simply stacking more layers. In this paper, we consider the issue from an information-theoretical perspective and propose a novel method, Relay Backpropagation, which encourages the propagation of effective information through the network during training. By virtue of this method, we achieved first place in the ILSVRC 2015 Scene Classification Challenge. Extensive experiments on two challenging large-scale datasets demonstrate that the effectiveness of our method is not restricted to a specific dataset or network architecture. Our models will be made available to the research community later.
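The abstract describes the mechanism only at a high level. The core idea of Relay Backpropagation is to divide the network into consecutive segments, attach auxiliary losses at intermediate points, and let each loss's gradient propagate back through only a few segments, so that no single gradient has to traverse the entire depth. The sketch below is a minimal, hypothetical illustration on a toy three-segment linear network (the dimensions, learning rate, and all variable names are illustrative, not from the paper): the final loss updates segments 3 and 2 only, while an auxiliary loss attached after segment 1 trains segment 1.

```python
import numpy as np

# Toy three-segment linear network; a minimal sketch of the relay idea,
# not the paper's actual architecture. Hypothetical names/sizes throughout.
rng = np.random.default_rng(0)
dim = 4
W1 = rng.standard_normal((dim, dim)) * 0.1  # segment 1
W2 = rng.standard_normal((dim, dim)) * 0.1  # segment 2
W3 = rng.standard_normal((dim, dim)) * 0.1  # segment 3 (output)
Wa = rng.standard_normal((dim, dim)) * 0.1  # auxiliary head after segment 1

x = rng.standard_normal(dim)  # one toy input
t = rng.standard_normal(dim)  # its regression target

lr = 0.05
losses, aux_losses = [], []
for _ in range(1000):
    h1 = W1 @ x        # segment 1 output
    h2 = W2 @ h1       # segment 2 output
    y = W3 @ h2        # final prediction
    y_aux = Wa @ h1    # auxiliary prediction

    # Final loss: its gradient is relayed back through segments 3 and 2 only.
    g3 = y - t
    dW3 = np.outer(g3, h2)
    g2 = W3.T @ g3
    dW2 = np.outer(g2, h1)
    # The gradient is deliberately NOT propagated further into segment 1.

    # Auxiliary loss: trains the auxiliary head and segment 1.
    ga = y_aux - t
    dWa = np.outer(ga, h1)
    g1 = Wa.T @ ga
    dW1 = np.outer(g1, x)

    W1 -= lr * dW1
    W2 -= lr * dW2
    W3 -= lr * dW3
    Wa -= lr * dWa
    losses.append(0.5 * float(g3 @ g3))
    aux_losses.append(0.5 * float(ga @ ga))
```

Even though no loss ever backpropagates through all three segments, every segment still receives a training signal, and both losses decrease over the run; this is the sense in which the method keeps "effective information" flowing to the lower layers of a very deep network.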
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Image Classification | COCO-MLT | Average mAP | 46.97 | RS(ResNet-50) |
| Image Classification | VOC-MLT | Average mAP | 75.38 | RS(ResNet-50) |