
Detail of Publication

Text Language Japanese
Authors Yoshihiro Yamada, Masakazu Iwamura, Koichi Kise
Title ShakeDrop Regularization
Journal 6th International Conference on Learning Representations (ICLR) Workshop
Pages pp.1-4
Number of Pages 4 pages
Reviewed or not Reviewed
Month & Year April 2018
Abstract This paper proposes a powerful regularization method named ShakeDrop regularization. ShakeDrop is inspired by Shake-Shake regularization, which decreases error rates by disturbing learning. While Shake-Shake can be applied only to ResNeXt, which has multiple branches, ShakeDrop can be applied not only to ResNeXt but also to ResNet, Wide ResNet, and PyramidNet in a memory-efficient way. An important and interesting feature of ShakeDrop is that it strongly disturbs learning by multiplying the output of a convolutional layer by even a negative factor in the forward training pass. The effectiveness of ShakeDrop is confirmed by experiments on the CIFAR-10/100 and Tiny ImageNet datasets.
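The abstract's key idea, scaling a residual branch by a possibly negative random factor during the forward training pass, can be sketched as follows. This is an illustrative NumPy sketch, not the paper's implementation: the function name, the Bernoulli gate `b`, the drop probability `p_drop`, and the symmetric `alpha` range are assumptions chosen to mimic the described behavior, and the backward pass is omitted.

```python
import numpy as np

def shakedrop_forward(x, residual, p_drop=0.5, alpha_range=(-1.0, 1.0),
                      training=True, rng=None):
    """Illustrative ShakeDrop-style forward pass (sketch, not the paper's code).

    x        : identity-branch activations
    residual : output of the residual (convolutional) branch F(x)
    """
    rng = rng if rng is not None else np.random.default_rng()
    if training:
        b = float(rng.random() < (1.0 - p_drop))   # Bernoulli gate
        alpha = rng.uniform(*alpha_range)          # may be negative
        # scale is 1 when b == 1, and alpha (possibly negative) when b == 0
        scale = b + alpha - b * alpha
        return x + scale * residual
    # At test time, replace the random scale by its expectation;
    # with a zero-mean alpha range this is (1 - p_drop).
    return x + (1.0 - p_drop) * residual
```

With `b = 1` the block behaves like a plain residual unit (scale is exactly 1); with `b = 0` the residual branch is multiplied by a random `alpha` in [-1, 1], which is how a negative factor can reach the forward pass.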