TY - GEN
T1 - Efficient Incremental Training for Deep Convolutional Neural Networks
AU - Tao, Yudong
AU - Tu, Yuexuan
AU - Shyu, Mei-Ling
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/4/22
Y1 - 2019/4/22
N2 - While deep convolutional neural networks (DCNNs) have shown excellent performance in various applications such as image classification, training a DCNN model from scratch is computationally expensive and time-consuming. In recent years, many studies have aimed to accelerate DCNN training, but most of them train the model in a one-time manner. Inspired by human learning patterns, in which people typically learn more comfortably in an incremental way and may be overwhelmed when absorbing a large amount of new information at once, we demonstrate a new training schema that splits the whole training process into several sub-training steps. In this study, we propose an efficient DCNN training framework that learns new classes of concepts incrementally. Experiments are conducted on CIFAR-100 with VGG-19 as the backbone network. The proposed framework achieves accuracy comparable to that of a model trained from scratch while training 1.42x faster.
AB - While deep convolutional neural networks (DCNNs) have shown excellent performance in various applications such as image classification, training a DCNN model from scratch is computationally expensive and time-consuming. In recent years, many studies have aimed to accelerate DCNN training, but most of them train the model in a one-time manner. Inspired by human learning patterns, in which people typically learn more comfortably in an incremental way and may be overwhelmed when absorbing a large amount of new information at once, we demonstrate a new training schema that splits the whole training process into several sub-training steps. In this study, we propose an efficient DCNN training framework that learns new classes of concepts incrementally. Experiments are conducted on CIFAR-100 with VGG-19 as the backbone network. The proposed framework achieves accuracy comparable to that of a model trained from scratch while training 1.42x faster.
KW - Deep Convolutional Neural Network (DCNN)
KW - Efficient Model Training
KW - Incremental Model Training
UR - http://www.scopus.com/inward/record.url?scp=85065604368&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85065604368&partnerID=8YFLogxK
U2 - 10.1109/MIPR.2019.00058
DO - 10.1109/MIPR.2019.00058
M3 - Conference contribution
AN - SCOPUS:85065604368
T3 - Proceedings - 2nd International Conference on Multimedia Information Processing and Retrieval, MIPR 2019
SP - 286
EP - 291
BT - Proceedings - 2nd International Conference on Multimedia Information Processing and Retrieval, MIPR 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2nd International Conference on Multimedia Information Processing and Retrieval, MIPR 2019
Y2 - 28 March 2019 through 30 March 2019
ER -