The Korean Institute of Electrical Engineers (대한전기학회)
The Transactions of the Korean Institute of Electrical Engineers
Title Comparison of Structure Reduction, Pruning, and Knowledge Distillation for Lightening of Deep Learning
Authors 서기성(Kisung Seo)
DOI https://doi.org/10.5370/KIEE.2021.70.12.1934
Pages 1934-1939
ISSN 1975-8359
Keywords Deep Learning; Structure Reduction; Pruning; Knowledge Distillation; CIFAR10/100; ResNet56/110
Abstract We compare three approaches to lightening a deep learning network: structure reduction, pruning, and knowledge distillation.
Structure reduction eliminates a set of layers from the model, whereas pruning deletes filters within a layer. Knowledge distillation trains a small student model from a large teacher model using the KL divergence, and therefore achieves a similar model-reduction effect. These three lightening methods have rarely been compared to one another in terms of performance. To compare them on the network reduction problem, we investigate the accuracy and FLOPs of each method on the CIFAR10 and CIFAR100 datasets for ResNet models. A systematic analysis of the fundamental orientation and differences of each method is also provided.
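The knowledge distillation mentioned in the abstract is typically implemented as a KL-divergence loss between temperature-softened teacher and student outputs. The following is a minimal PyTorch sketch of that Hinton-style loss; the temperature and mixing weight are illustrative assumptions, not values taken from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    """Hinton-style knowledge distillation loss (illustrative sketch).

    The student's temperature-softened distribution is matched to the
    teacher's via KL divergence, and the ordinary cross-entropy on the
    hard labels is mixed in with weight (1 - alpha). `temperature` and
    `alpha` are assumed hyperparameters for illustration only.
    """
    # KL divergence between softened teacher and student distributions.
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean")
    kd = kd * temperature ** 2  # standard T^2 rescaling of the gradient

    # Cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```

In use, the teacher's logits are computed with gradients disabled (e.g. under `torch.no_grad()`), so only the small student model is updated, which is what gives distillation its model-reduction effect.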