TC3KD: Knowledge distillation via teacher-student cooperative curriculum customization
- 1 October 2022
- Research article
- Published by Elsevier BV in Neurocomputing
- Vol. 508, 284-292
- https://doi.org/10.1016/j.neucom.2022.07.055
Abstract
No abstract available.
This publication has 30 references indexed in Scilit:
- Learning from Multiple Teacher Networks. Published by Association for Computing Machinery (ACM), 2017
- A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning. Published by Institute of Electrical and Electronics Engineers (IEEE), 2017
- Densely Connected Convolutional Networks. Published by Institute of Electrical and Electronics Engineers (IEEE), 2017
- Curriculum Learning for Facial Expression Recognition. Published by Institute of Electrical and Electronics Engineers (IEEE), 2017
- Deep Residual Learning for Image Recognition. Published by Institute of Electrical and Electronics Engineers (IEEE), 2016
- How Hard Can It Be? Estimating the Difficulty of Visual Search in an Image. Published by Institute of Electrical and Electronics Engineers (IEEE), 2016
- Webly Supervised Learning of Convolutional Networks. Published by Institute of Electrical and Electronics Engineers (IEEE), 2015
- Easy Samples First. Published by Association for Computing Machinery (ACM), 2014
- Speeding up Convolutional Neural Networks with Low Rank Expansions. Published by British Machine Vision Association and Society for Pattern Recognition, 2014
- On the effectiveness of self-paced learning. Journal of Memory and Language, 2011