Clustered Federated Learning: Model-Agnostic Distributed Multi-Task Optimization under Privacy Constraints

4 Oct 2019 | Felix Sattler, Klaus-Robert Müller*, Member, IEEE, and Wojciech Samek*, Member, IEEE
The paper introduces Clustered Federated Learning (CFL), a novel framework for Federated Multi-Task Learning (FMTL) that addresses the issue of suboptimal results when local clients' data distributions diverge. CFL groups clients into clusters with jointly trainable data distributions, leveraging geometric properties of the Federated Learning (FL) loss surface. Unlike existing FMTL approaches, CFL does not require modifications to the FL communication protocol, is applicable to general non-convex objectives, and provides strong mathematical guarantees on clustering quality. The method is flexible, handling varying client populations and implementing privacy-preserving techniques. Experiments on deep convolutional and recurrent neural networks demonstrate that CFL can achieve significantly better performance than conventional FL by allowing clients to learn more specialized models. The paper also discusses related work, implementation considerations, and future research directions.
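The clustering step described above can be illustrated with a minimal sketch. CFL separates clients by inspecting the cosine similarity between their weight updates; the paper's actual criterion recursively bipartitions clients when the server's averaged update is small while individual client updates remain large. The code below is a simplified, hypothetical stand-in for that bipartitioning step (the function names and the greedy seeding heuristic are assumptions, not the paper's exact algorithm):

```python
import numpy as np

def cosine_similarity_matrix(updates):
    """Pairwise cosine similarities between flattened client weight updates."""
    U = np.stack([u / np.linalg.norm(u) for u in updates])
    return U @ U.T

def bipartition(updates):
    """Split clients into two groups by cosine similarity of their updates.

    A simplified stand-in for CFL's optimal bipartitioning: seed the two
    clusters with the least similar pair of clients, then assign every other
    client to the seed it is more similar to.
    """
    S = cosine_similarity_matrix(updates)
    n = len(updates)
    # Least similar pair of clients (argmin over the flattened matrix).
    i, j = divmod(int(np.argmin(S)), n)
    c1, c2 = [i], [j]
    for k in range(n):
        if k in (i, j):
            continue
        (c1 if S[k, i] >= S[k, j] else c2).append(k)
    return sorted(c1), sorted(c2)
```

For example, two clients whose updates point in roughly the same direction and two whose updates point the opposite way end up in separate clusters, which is the intuition behind CFL's claim that incongruent data distributions leave a geometric signature on the loss surface.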