Federated Learning using Distributed Messaging with Entitlements for Anonymous Computation and Secure Delivery of Model

Preprint posted on 2020-12-04, authored by Monik Raj Behera, Sudhir Upadhyay, Robert Otter, and Suresh Shetty.
Federated learning has become one of the most recent and widely researched areas of machine learning. Several machine-learning frameworks, such as TensorFlow Federated and PySyft, have gained momentum in the recent past and continue to evolve. Some of these frameworks employ techniques such as differential privacy, secure multi-party computation, and gradient computation over the network to preserve the privacy of the underlying data in federated learning. While these frameworks serve the need for a general-purpose federated learning model within their respective ecosystems, in this paper we present a solution based on distributed messaging with appropriate entitlements that enterprises can leverage in a managed and permissioned network. The solution enforces access controls on message source and destination in a decentralized network, and can host any given data science model in the federated network to facilitate secure federated learning.
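To illustrate the kind of entitlement enforcement the abstract describes, the sketch below shows a toy message broker that checks per-participant publish and subscribe permissions per topic before delivering federated model updates. All names here (Entitlements, EntitledBroker, the topic names) are hypothetical and not taken from the paper; this is a minimal illustration under assumed semantics, not the authors' implementation.

from dataclasses import dataclass, field

@dataclass
class Entitlements:
    # Maps a participant id to the set of topics it may publish to / subscribe from.
    can_publish: dict = field(default_factory=dict)
    can_subscribe: dict = field(default_factory=dict)

class EntitledBroker:
    """Toy broker that enforces source/destination entitlements on messages."""
    def __init__(self, entitlements: Entitlements):
        self.entitlements = entitlements
        self.subscribers = {}  # topic -> list of (participant, callback)

    def subscribe(self, participant: str, topic: str, callback):
        if topic not in self.entitlements.can_subscribe.get(participant, set()):
            raise PermissionError(f"{participant} may not subscribe to {topic}")
        self.subscribers.setdefault(topic, []).append((participant, callback))

    def publish(self, participant: str, topic: str, payload):
        if topic not in self.entitlements.can_publish.get(participant, set()):
            raise PermissionError(f"{participant} may not publish to {topic}")
        for _subscriber, callback in self.subscribers.get(topic, []):
            callback(payload)

# Usage: a worker node publishes model weights only to the aggregator's topic,
# and only the aggregator is entitled to subscribe to that topic.
ent = Entitlements(
    can_publish={"worker-1": {"model.updates"}, "aggregator": {"model.global"}},
    can_subscribe={"aggregator": {"model.updates"}, "worker-1": {"model.global"}},
)
broker = EntitledBroker(ent)
broker.subscribe("aggregator", "model.updates", lambda w: print("received update", w))
broker.publish("worker-1", "model.updates", {"layer1": [0.1, 0.2]})

Because the entitlement check happens at the messaging layer rather than inside the model code, any data science model can be plugged into the federated workflow without changing how access control is enforced.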

History

Email Address of Submitting Author

monik.r.behera@jpmorgan.com

ORCID of Submitting Author

https://orcid.org/0000-0001-9385-2533

Submitting Author's Institution

JPMorgan Chase & Co

Submitting Author's Country

India