TY - GEN
T1 - Faster AutoAugment: Learning Augmentation Strategies Using Backpropagation
T2 - 16th European Conference on Computer Vision, ECCV 2020
AU - Hataya, Ryuichiro
AU - Zdenek, Jan
AU - Yoshizoe, Kazuki
AU - Nakayama, Hideki
N1 - Funding Information:
Acknowledgement. The research results were achieved as a part of the “Research and Development of Deep Learning Technology for Advanced Multilingual Speech Translation”, the Commissioned Research of the National Institute of Information and Communications Technology, JAPAN. This work was also supported by JSPS KAKENHI Grant Numbers JP19H04166, JP19K22861 and JP20H04251. We used the RAIDEN system for the experiments.
Publisher Copyright:
© 2020, Springer Nature Switzerland AG.
PY - 2020
Y1 - 2020
N2 - Data augmentation methods are indispensable heuristics to boost the performance of deep neural networks, especially in image recognition tasks. Recently, several studies have shown that augmentation strategies found by search algorithms outperform hand-made strategies. Such methods employ black-box search algorithms over image transformations with continuous or discrete parameters and require a long time to obtain better strategies. In this paper, we propose a differentiable policy search pipeline for data augmentation, which is much faster than previous methods. We introduce approximate gradients for several transformation operations with discrete parameters as well as a differentiable mechanism for selecting operations. As the objective of training, we minimize the distance between the distributions of augmented and original data, which can be differentiated. We show that our method, Faster AutoAugment, achieves significantly faster searching than prior methods without a performance drop.
AB - Data augmentation methods are indispensable heuristics to boost the performance of deep neural networks, especially in image recognition tasks. Recently, several studies have shown that augmentation strategies found by search algorithms outperform hand-made strategies. Such methods employ black-box search algorithms over image transformations with continuous or discrete parameters and require a long time to obtain better strategies. In this paper, we propose a differentiable policy search pipeline for data augmentation, which is much faster than previous methods. We introduce approximate gradients for several transformation operations with discrete parameters as well as a differentiable mechanism for selecting operations. As the objective of training, we minimize the distance between the distributions of augmented and original data, which can be differentiated. We show that our method, Faster AutoAugment, achieves significantly faster searching than prior methods without a performance drop.
UR - http://www.scopus.com/inward/record.url?scp=85097398331&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85097398331&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-58595-2_1
DO - 10.1007/978-3-030-58595-2_1
M3 - Conference contribution
AN - SCOPUS:85097398331
SN - 9783030585945
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 1
EP - 16
BT - Computer Vision – ECCV 2020 - 16th European Conference, 2020, Proceedings
A2 - Vedaldi, Andrea
A2 - Bischof, Horst
A2 - Brox, Thomas
A2 - Frahm, Jan-Michael
PB - Springer Science and Business Media Deutschland GmbH
Y2 - 23 August 2020 through 28 August 2020
ER -