Retraining

A Simple Way to Improve the Ensemble Accuracy of Deep Neural Networks for Image Classification

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

In this paper, we propose a new heuristic training procedure to help a deep neural network (DNN) repeatedly escape from a local minimum and move to a better local minimum. Our method repeats the following processes multiple times: Randomly reinitializing the weights of the last layer of a converged DNN while preserving the weights of the remaining layers, and then conducting a new round of training. The motivation is to make the training in the new round learn better parameters based on the 'good' initial parameters learned in the previous round. With multiple randomly initialized DNNs trained based on our training procedure, we can obtain an ensemble of DNNs that are more accurate and diverse compared with the normal training procedure. We call this framework 'retraining'. Experiments on eight DNN models show that our method generally outperforms the state-of-the-art ensemble learning methods. We also provide two variants of the retraining framework to tackle the tasks of ensemble learning in which 1) DNNs exhibit very high training accuracies (e.g., > 95%) and 2) DNNs are too computationally expensive to train.
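The retraining loop described in the abstract can be sketched in a few lines. The toy NumPy example below is an illustrative assumption, not the authors' implementation: it uses a tiny two-layer network on synthetic blob data, and the hidden width, learning rate, epoch budget, and number of retraining rounds are all placeholder choices. It shows the core heuristic (train to convergence, reinitialize only the last layer, train again) and the ensemble of independently initialized members.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: two well-separated Gaussian blobs.
X = np.vstack([rng.normal(-2.0, 1.0, (100, 2)), rng.normal(2.0, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class MLP:
    """Minimal two-layer network standing in for a DNN."""

    def __init__(self, rng, d_in=2, d_h=8, d_out=2):
        self.rng = rng
        self.W1 = rng.normal(0, 0.5, (d_in, d_h))   # "remaining layers"
        self.W2 = rng.normal(0, 0.5, (d_h, d_out))  # last layer

    def forward(self, X):
        self.H = np.tanh(X @ self.W1)
        return softmax(self.H @ self.W2)

    def train(self, X, y, epochs=200, lr=0.1):
        onehot = np.eye(2)[y]
        for _ in range(epochs):
            P = self.forward(X)
            g_out = (P - onehot) / len(X)               # d(cross-entropy)/d(logits)
            g_h = (g_out @ self.W2.T) * (1 - self.H**2)  # backprop through tanh
            self.W2 -= lr * self.H.T @ g_out
            self.W1 -= lr * X.T @ g_h

    def retrain(self, X, y, rounds=3, **kw):
        # The paper's heuristic: after convergence, randomly reinitialize
        # only the last layer's weights, keep the rest, and train again.
        self.train(X, y, **kw)
        for _ in range(rounds):
            self.W2 = self.rng.normal(0, 0.5, self.W2.shape)
            self.train(X, y, **kw)

# Ensemble of independently initialized members, each retrained.
members = []
for seed in range(3):
    net = MLP(np.random.default_rng(seed))
    net.retrain(X, y)
    members.append(net)

probs = np.mean([m.forward(X) for m in members], axis=0)  # average softmax
acc = (probs.argmax(axis=1) == y).mean()
print(f"ensemble training accuracy: {acc:.2f}")
```

Because each retraining round starts from the previous round's non-last-layer weights, every round begins from "good" initial parameters rather than from scratch, which is the motivation stated in the abstract.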

Original language: English
Title of host publication: 2018 24th International Conference on Pattern Recognition, ICPR 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 860-867
Number of pages: 8
ISBN (Electronic): 9781538637883
DOI: 10.1109/ICPR.2018.8545535
Publication status: Published - Nov 26 2018
Event: 24th International Conference on Pattern Recognition, ICPR 2018 - Beijing, China
Duration: Aug 20 2018 - Aug 24 2018

Publication series

Name: Proceedings - International Conference on Pattern Recognition
Volume: 2018-August
ISSN (Print): 1051-4651

Other

Other: 24th International Conference on Pattern Recognition, ICPR 2018
Country: China
City: Beijing
Period: 8/20/18 - 8/24/18

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition

Cite this

Zhao, K., Matsukawa, T., & Suzuki, E. (2018). Retraining: A Simple Way to Improve the Ensemble Accuracy of Deep Neural Networks for Image Classification. In 2018 24th International Conference on Pattern Recognition, ICPR 2018 (pp. 860-867). [8545535] (Proceedings - International Conference on Pattern Recognition; Vol. 2018-August). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICPR.2018.8545535

@inproceedings{fa84757ca51b41feafd1204aee78736c,
title = "Retraining: A Simple Way to Improve the Ensemble Accuracy of Deep Neural Networks for Image Classification",
abstract = "In this paper, we propose a new heuristic training procedure to help a deep neural network (DNN) repeatedly escape from a local minimum and move to a better local minimum. Our method repeats the following processes multiple times: Randomly reinitializing the weights of the last layer of a converged DNN while preserving the weights of the remaining layers, and then conducting a new round of training. The motivation is to make the training in the new round learn better parameters based on the 'good' initial parameters learned in the previous round. With multiple randomly initialized DNNs trained based on our training procedure, we can obtain an ensemble of DNNs that are more accurate and diverse compared with the normal training procedure. We call this framework 'retraining'. Experiments on eight DNN models show that our method generally outperforms the state-of-the-art ensemble learning methods. We also provide two variants of the retraining framework to tackle the tasks of ensemble learning in which 1) DNNs exhibit very high training accuracies (e.g., > 95{\%}) and 2) DNNs are too computationally expensive to train.",
author = "Kaikai Zhao and Tetsu Matsukawa and Einoshin Suzuki",
year = "2018",
month = "11",
day = "26",
doi = "10.1109/ICPR.2018.8545535",
language = "English",
series = "Proceedings - International Conference on Pattern Recognition",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "860--867",
booktitle = "2018 24th International Conference on Pattern Recognition, ICPR 2018",
address = "United States",

}
