
Federated Learning using Distributed Messaging with Entitlements for Anonymous Computation and Secure Delivery of Model
  • Monik Raj Behera,
  • Sudhir Upadhyay,
  • Robert Otter,
  • Suresh Shetty

Abstract

Federated learning has become one of the most active and widely researched areas of machine learning. Several machine-learning frameworks, such as TensorFlow Federated and PySyft, have gained momentum in the recent past and continue to evolve. Some of these frameworks employ techniques such as differential privacy, secure multi-party computation, and gradient computation over the network to preserve the privacy of the underlying data in federated learning. While these frameworks serve the need for a general-purpose federated learning model within the bounds of a given framework, in this paper we present a solution based on distributed messaging with appropriate entitlements that enterprises can leverage in a managed, permissioned network. The solution enforces access controls on message sources and destinations in a decentralized network and can host any given data science model in the federated network, thereby facilitating secure federated learning.
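
To illustrate the entitlement idea described above, the following sketch shows how access controls on message sources and destinations might gate the delivery of model updates on a permissioned messaging topic. This is a minimal, self-contained Python illustration under assumed names (ModelUpdate, Entitlements, PermissionedTopic), not the system presented in the paper.

```python
# Illustrative sketch only: entitlement-checked delivery of federated-learning
# model updates over a permissioned messaging topic. All class and method
# names here are hypothetical stand-ins, not the paper's implementation.

from dataclasses import dataclass, field
from typing import Dict, List, Set


@dataclass
class ModelUpdate:
    """A serialized model update sent from one participant to another."""
    source: str          # sending participant id
    destination: str     # receiving participant id (e.g. an aggregator node)
    payload: bytes       # serialized weights or gradients


@dataclass
class Entitlements:
    """Access-control lists: which sources may deliver to which destinations."""
    allowed: Dict[str, Set[str]] = field(default_factory=dict)

    def permit(self, source: str, destination: str) -> None:
        self.allowed.setdefault(source, set()).add(destination)

    def is_permitted(self, source: str, destination: str) -> bool:
        return destination in self.allowed.get(source, set())


class PermissionedTopic:
    """In-memory stand-in for a distributed messaging topic with entitlements."""

    def __init__(self, entitlements: Entitlements) -> None:
        self.entitlements = entitlements
        self.delivered: List[ModelUpdate] = []

    def publish(self, update: ModelUpdate) -> bool:
        # Drop messages whose source is not entitled to reach the destination.
        if not self.entitlements.is_permitted(update.source, update.destination):
            return False
        self.delivered.append(update)
        return True


if __name__ == "__main__":
    acl = Entitlements()
    acl.permit("participant-a", "aggregator")   # participant-a may send updates to the aggregator
    topic = PermissionedTopic(acl)

    ok = topic.publish(ModelUpdate("participant-a", "aggregator", b"weights-round-1"))
    blocked = topic.publish(ModelUpdate("participant-b", "aggregator", b"weights-round-1"))
    print(ok, blocked)  # True False
```

In this sketch the entitlement check happens at publish time, so unauthorized participants never reach the destination's queue; a production messaging layer would typically enforce such permissions at the broker rather than in application code.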