Reduced-Complexity Deep Neural Networks Design Using Multi-Level Compression
- 1 June 2017
- research article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Sustainable Computing
- Vol. 4 (2), 245-251
- https://doi.org/10.1109/tsusc.2017.2710178
Abstract
Deep neural networks (DNNs) have achieved great success in many fields. However, many DNN models are both deep and large, causing high storage and energy consumption during both the training and inference phases. This paper proposes a multi-level compression framework. By combining cross-layer parameter-reduction techniques ranging from structure compression to weight compression to representation compression, the proposed strategy enables order-of-magnitude reductions in network size for both training and inference with negligible accuracy loss, leading to highly efficient yet accurate DNN models. Experiments show that the proposed strategy achieves a compression ratio of around 1,800x on the dense matrices and around 30x for the overall model.
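
The abstract names a cascade of three compression levels but does not spell out the concrete technique used at each level. As a rough illustration only, the sketch below chains three common stand-ins: low-rank factorization for structure compression, magnitude pruning for weight compression, and uniform quantization for representation compression. Every function name, threshold, and bit width here is a hypothetical choice for the sketch, not the paper's actual method.

```python
import numpy as np

# --- Level 1: structure compression ---
# Illustrated with low-rank factorization (an assumption; the paper's
# structural technique is not specified in the abstract).
def low_rank_factorize(W, rank):
    """Approximate a dense matrix W by a product of two thin matrices."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * s[:rank]   # shape (m, rank)
    B = Vt[:rank, :]             # shape (rank, n)
    return A, B

# --- Level 2: weight compression ---
# Magnitude pruning with a hypothetical 90% sparsity target.
def prune(W, sparsity=0.9):
    """Zero out the smallest-magnitude weights."""
    thresh = np.quantile(np.abs(W), sparsity)
    return np.where(np.abs(W) >= thresh, W, 0.0)

# --- Level 3: representation compression ---
# Symmetric per-tensor uniform quantization to a hypothetical 4 bits.
def quantize(W, bits=4):
    """Map surviving weights to 2**bits uniform integer levels."""
    scale = np.abs(W).max() / (2 ** (bits - 1) - 1)
    q = np.round(W / scale).astype(np.int8)
    return q, scale  # store low-bit integers plus one float scale

# Chain the three levels on a random 1024x1024 layer and report a
# rough storage estimate (index overhead for sparsity is ignored).
rng = np.random.default_rng(0)
W = rng.standard_normal((1024, 1024)).astype(np.float32)

A, B = low_rank_factorize(W, rank=64)              # structure
A_p, B_p = prune(A), prune(B)                      # weights
(qA, sA), (qB, sB) = quantize(A_p), quantize(B_p)  # representation

dense_bits = W.size * 32
nnz = np.count_nonzero(qA) + np.count_nonzero(qB)
compressed_bits = nnz * 4
print(f"rough compression ratio: {dense_bits / compressed_bits:.0f}x")
```

The point of chaining the levels is that their savings multiply: the rank cut, the pruning ratio, and the bit-width reduction each contribute a factor, which is how order-of-magnitude overall ratios become attainable.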