FSSA: Efficient 3-Round Secure Aggregation for Privacy-Preserving Federated Learning
Federated learning (FL) allows a large number of clients to collaboratively train machine learning (ML) models by sending only their local gradients, rather than their raw training data, to a central server for aggregation in each training iteration. This paper proposes a 3-round secure aggregation protocol that is efficient in terms of computation and communication, and resilient to client dropouts.
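To make the secure-aggregation setting concrete, the sketch below illustrates the generic pairwise-masking idea such protocols build on: each client blinds its gradient with masks shared with other clients so that the server learns only the sum, and the masks cancel in aggregation. This is a minimal illustration, not the FSSA protocol itself; the function names, the toy modulus, the seed function, and the omission of dropout recovery are all simplifying assumptions.

```python
import random

PRIME = 2**31 - 1  # toy modulus for masking (illustrative only)

def pairwise_masks(client_ids, dim, seed_fn):
    """For each pair (i, j), derive a shared mask from a common seed;
    client i adds it, client j subtracts it, so the masks cancel in the sum."""
    masks = {i: [0] * dim for i in client_ids}
    for idx, i in enumerate(client_ids):
        for j in client_ids[idx + 1:]:
            rng = random.Random(seed_fn(i, j))
            m = [rng.randrange(PRIME) for _ in range(dim)]
            masks[i] = [(a + b) % PRIME for a, b in zip(masks[i], m)]
            masks[j] = [(a - b) % PRIME for a, b in zip(masks[j], m)]
    return masks

def mask_update(update, mask):
    """Client-side blinding of an integer-encoded gradient vector."""
    return [(u + m) % PRIME for u, m in zip(update, mask)]

def aggregate(masked_updates):
    """Server sums the masked vectors; with no dropouts, the pairwise
    masks cancel and only the aggregate gradient is revealed."""
    dim = len(next(iter(masked_updates.values())))
    total = [0] * dim
    for vec in masked_updates.values():
        total = [(t + v) % PRIME for t, v in zip(total, vec)]
    return total

# Example: three clients with 4-dimensional integer-encoded gradients.
clients = [1, 2, 3]
updates = {1: [5, 0, 2, 7], 2: [1, 1, 1, 1], 3: [3, 9, 0, 4]}
seed = lambda i, j: hash((min(i, j), max(i, j)))  # stand-in for an agreed key
masks = pairwise_masks(clients, 4, seed)
masked = {i: mask_update(updates[i], masks[i]) for i in clients}
print(aggregate(masked))  # equals the plain sum [9, 10, 3, 12] once masks cancel
```

Handling dropouts, which the paper targets, requires extra machinery (e.g., recovering or cancelling the masks of clients that leave mid-round); that part is deliberately left out of this sketch.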