Personalized Edge Intelligence via Federated Self-Knowledge Distillation
Open Access
- 28 November 2022
- research article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Parallel and Distributed Systems
- Vol. 34 (2), 567-580
- https://doi.org/10.1109/tpds.2022.3225185
Abstract
Federated Learning (FL) is an emerging approach in edge computing for collaboratively training machine learning models across multiple devices; it aims to address the limited bandwidth, system heterogeneity, and privacy issues of traditional centralized training. However, existing federated learning methods focus on learning a single shared global model for all devices, which is not always ideal, especially when each edge device has its own data distribution or task. In this paper, we study personalized federated learning, where the goal is to train models that perform well for individual clients. We observe that the model initialization at the start of each communication round causes clients to forget historical personalized knowledge. Based on this observation, we propose a novel Personalized Federated Learning (PFL) framework via self-knowledge distillation, named pFedSD. By letting each client distill the knowledge of its previous personalized model into its current local model, pFedSD accelerates the recall of personalized knowledge in freshly initialized clients. Moreover, self-knowledge distillation provides different views of the data in feature space, realizing an implicit ensemble of local models. Extensive experiments on various datasets and settings demonstrate the effectiveness and robustness of pFedSD.
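The distillation step described in the abstract can be sketched as a per-example local objective: cross-entropy on the true label plus a KL term that pulls the freshly initialized local model toward the previous round's personalized model. This is a minimal illustrative sketch assuming the standard knowledge-distillation loss form; the function names, temperature, and weighting constant are hypothetical and not taken from the paper.

```python
import math

def softmax(logits, temperature=1.0):
    # Numerically stable, temperature-scaled softmax over a list of logits.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def self_distillation_loss(student_logits, teacher_logits, label,
                           temperature=2.0, lam=0.5):
    """Hypothetical pFedSD-style local objective for one example:
    cross-entropy on the ground-truth label plus a KL term between the
    previous personalized model's predictions (the 'teacher') and the
    current local model's predictions (the 'student')."""
    p_student = softmax(student_logits)
    ce = -math.log(p_student[label] + 1e-12)
    # KL(teacher || student) over temperature-smoothed distributions.
    pt = softmax(teacher_logits, temperature)
    ps = softmax(student_logits, temperature)
    kl = sum(t * math.log((t + 1e-12) / (s + 1e-12)) for t, s in zip(pt, ps))
    # The T^2 factor is the usual rescaling that keeps the gradient
    # magnitude of the KL term comparable to the cross-entropy term.
    return ce + lam * (temperature ** 2) * kl
```

When the current local model already agrees with the previous personalized model, the KL term vanishes and the loss reduces to plain cross-entropy, so the distillation term acts only as a penalty against forgetting the personalized knowledge after global re-initialization.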
Funding Information
- National Natural Science Foundation of China (62072204, 2020kfyXJJS019)
- Fundamental Research Funds for the Central Universities