Communication Efficient Privacy-Preserving Federated Learning via Knowledge Distillation for Human Activity Recognition Systems
Emerging Internet of Things (IoT) applications, such as sensor-based Human Activity Recognition (HAR) systems, require efficient machine learning solutions; their resource-constrained nature raises the need for heterogeneous model architectures. Federated Learning (FL) has been used to train distributed deep learning models, but standard federated averaging (FedAvg) does not support the training of heterogeneous models. Our work addresses the model and statistical heterogeneity of distributed HAR systems. We propose Federated Learning via Augmented Knowledge Distillation (FedAKD), an algorithm for heterogeneous HAR systems, and evaluate it on a self-collected sensor-based HAR dataset. We then compare Kullback-Leibler (KL) divergence loss with Mean Squared Error (MSE) loss for the Knowledge Distillation (KD) mechanism; our experiments demonstrate that MSE yields a better KD loss than KL. Experiments also show that FedAKD is communication-efficient compared with model-dependent FL algorithms and outperforms other KD-based FL methods under both i.i.d. and non-i.i.d. scenarios.
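As a minimal sketch (not the authors' implementation), the two distillation losses compared in the abstract can be written as follows: KL divergence is computed between temperature-softened class distributions, while MSE is applied directly to the logits. The temperature value and the small epsilon for numerical stability are illustrative assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    # temperature-scaled softmax over the last axis
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss_kl(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))

def kd_loss_mse(student_logits, teacher_logits):
    # mean squared error directly on the raw logits
    s = np.asarray(student_logits, dtype=float)
    t = np.asarray(teacher_logits, dtype=float)
    return float(np.mean((s - t) ** 2))

teacher = [2.0, 0.5, -1.0]   # hypothetical teacher (global) logits
student = [1.5, 0.8, -0.5]   # hypothetical student (client) logits
print(kd_loss_kl(student, teacher))   # non-negative; zero iff distributions match
print(kd_loss_mse(student, teacher))  # non-negative; zero iff logits match
```

In KD-based FL, each client would minimize one of these losses against aggregated global predictions rather than exchanging model weights, which is the source of the communication savings the abstract refers to.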
Email Address of Submitting Author: ggad@lakeheadu.ca
ORCID of Submitting Author: 0000-0001-9177-9950
Submitting Author's Institution: Lakehead University
Submitting Author's Country: