
Communication Efficient Privacy-Preserving Federated Learning via Knowledge Distillation for Human Activity Recognition Systems
  • Gad Gad,
  • Zubair Fadlullah,
  • Khaled Rabie,
  • Mostafa M. Fouda
Gad Gad
Lakehead University

Corresponding Author: [email protected]


Abstract

Emerging Internet of Things (IoT) applications, such as sensor-based Human Activity Recognition (HAR) systems, require efficient machine learning solutions due to their resource-constrained nature, which raises the need for heterogeneous model architectures. Federated Learning (FL) has been used to train distributed deep learning models. However, standard federated learning (FedAvg) does not allow the training of heterogeneous models. Our work addresses the model and statistical heterogeneities of distributed HAR systems. We propose a Federated Learning via Augmented Knowledge Distillation (FedAKD) algorithm for heterogeneous HAR systems and evaluate it on a self-collected sensor-based HAR dataset. We then compare Kullback-Leibler (KL) divergence loss with Mean Squared Error (MSE) loss for the Knowledge Distillation (KD) mechanism. Our experiments demonstrate that MSE yields a more effective KD loss than KL divergence. Experiments also show that FedAKD is communication-efficient compared with model-dependent FL algorithms and outperforms other KD-based FL methods under both i.i.d. and non-i.i.d. scenarios.
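To make the KL-versus-MSE comparison concrete, the two candidate distillation losses mentioned above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, the temperature parameter, and the convention of applying MSE directly to logits are assumptions for illustration.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stabilized).
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss_kl(student_logits, teacher_logits, T=1.0):
    # KL(teacher || student) on temperature-softened class probabilities,
    # averaged over the batch; a common KD objective.
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student predictions
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))) / p.shape[0])

def kd_loss_mse(student_logits, teacher_logits):
    # Mean squared error computed directly on the logits (one possible
    # MSE-based KD variant).
    s = np.asarray(student_logits, dtype=float)
    t = np.asarray(teacher_logits, dtype=float)
    return float(np.mean((s - t) ** 2))

# Toy usage on a single 3-class example.
student = np.array([[2.0, 0.5, -1.0]])
teacher = np.array([[1.5, 1.0, -0.5]])
print(kd_loss_kl(student, teacher))   # small positive KL divergence
print(kd_loss_mse(student, teacher))  # small positive squared-error term
```

Both losses vanish when student and teacher agree; the KL variant compares normalized probability distributions, while the MSE variant also penalizes differences in logit magnitude, which is one intuition for why the two can behave differently as KD objectives.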