
On the Security of Privacy-Preserving Federated Learning via Functional Encryption

posted on 2023-04-14, 18:58 authored by Fucai Luo, Haiyan Wang

Federated learning (FL) is a promising collaborative machine learning (ML) framework that allows models to be trained on sensitive real-world data while preserving data privacy. Unfortunately, recent attacks on FL demonstrate that the local gradients provided by participating users may leak information about their local training data. To address this privacy issue, the notion of secure aggregation was proposed and has been achieved by different approaches, including secure multiparty computation (MPC), homomorphic encryption (HE), functional encryption (FE), and double-masking.
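To illustrate the secure-aggregation idea mentioned above, the following is a minimal sketch of pairwise-masking (double-masking style) aggregation; it is not the scheme analyzed in this paper, and the helper names, the modulus, and the string-seeded PRG standing in for a shared pairwise secret are all illustrative assumptions:

```python
import random

MOD = 2 ** 16  # illustrative modulus for integer-encoded gradients


def masked_update(uid, gradient, all_ids, mod=MOD):
    # Each ordered pair of users (lo, hi) derives a shared mask from a
    # common seed; user `lo` adds it and user `hi` subtracts it, so all
    # pairwise masks cancel in the server's sum.
    masked = list(gradient)
    for other in all_ids:
        if other == uid:
            continue
        lo, hi = min(uid, other), max(uid, other)
        rng = random.Random(f"{lo}-{hi}")  # stand-in for an agreed PRG seed
        mask = [rng.randrange(mod) for _ in gradient]
        sign = 1 if uid == lo else -1
        masked = [(g + sign * m) % mod for g, m in zip(masked, mask)]
    return masked


def aggregate(masked_updates, mod=MOD):
    # The server only sums masked vectors; it never sees a raw gradient,
    # yet the pairwise masks cancel modulo `mod`.
    total = [0] * len(masked_updates[0])
    for vec in masked_updates:
        total = [(t + v) % mod for t, v in zip(total, vec)]
    return total
```

For example, with users 1, 2, 3 holding gradients [1, 2], [3, 4], [5, 6], the server recovers only the sum [9, 12] from the masked updates.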

In 2023, Chang et al. proposed a novel privacy-preserving federated learning (PPFL) scheme based on FE and claimed that it resolves the aforementioned privacy issue. However, in this paper, we demonstrate that their PPFL scheme is vulnerable to two attacks we design, and is therefore insecure: it still suffers from the above privacy issue. We hope that by identifying this security problem, similar errors can be avoided in future designs of PPFL.


Submitting Author's Institution

Peng Cheng Laboratory

Submitting Author's Country

China