
Communication Efficient Privacy-Preserving Federated Learning via Knowledge Distillation for Human Activity Recognition Systems

Posted on 2023-06-16, 16:39, authored by Gad Gad, Zubair Fadlullah, Khaled Rabie, Mostafa M. Fouda

Emerging Internet of Things (IoT) applications, such as sensor-based Human Activity Recognition (HAR) systems, require efficient machine learning solutions because of their resource-constrained nature, which raises the need for heterogeneous model architectures. Federated Learning (FL) has been used to train distributed deep learning models. However, standard federated averaging (FedAvg) does not allow the training of heterogeneous models. Our work addresses the model and statistical heterogeneity of distributed HAR systems. We propose Federated Learning via Augmented Knowledge Distillation (FedAKD), an algorithm for heterogeneous HAR systems, and evaluate it on a self-collected sensor-based HAR dataset. We then compare Kullback-Leibler (KL) divergence loss with Mean Squared Error (MSE) loss for the Knowledge Distillation (KD) mechanism; our experiments demonstrate that MSE yields a better KD loss than KL divergence. Experiments also show that FedAKD is communication-efficient compared with model-dependent FL algorithms and outperforms other KD-based FL methods under both i.i.d. and non-i.i.d. scenarios.
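The abstract compares KL divergence and MSE as objectives for logit-based knowledge distillation. As a hedged illustration only (not the paper's implementation, and with hypothetical example logits), a minimal pure-Python sketch of the two losses a student could minimize against a teacher's outputs:

```python
import math

def softmax(logits):
    # Numerically stable softmax: shift by the max logit before exponentiating.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_distillation_loss(teacher_logits, student_logits):
    # KL(teacher || student) between the two softmax distributions.
    p = softmax(teacher_logits)
    q = softmax(student_logits)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def mse_distillation_loss(teacher_logits, student_logits):
    # Mean squared error computed directly on the raw logits.
    return sum((t - s) ** 2 for t, s in zip(teacher_logits, student_logits)) / len(teacher_logits)

# Hypothetical teacher/student logits for a 3-class HAR problem.
teacher = [2.0, 0.5, -1.0]
student = [1.5, 0.7, -0.8]
print(kl_distillation_loss(teacher, student))
print(mse_distillation_loss(teacher, student))
```

Both losses are zero when the student matches the teacher exactly; they differ in that KL compares normalized class probabilities while MSE also penalizes mismatched logit scale, which is one possible reason the two can behave differently as KD objectives.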



Submitting Author's Institution

Lakehead University

Submitting Author's Country

  • Egypt