15 Apr 2024 | Kai Yi, Nidham Gazagnadou, Peter Richtárik, Lingjuan Lyu
FedP3 is a federated learning framework designed to address model heterogeneity and enhance privacy. It combines global and local pruning strategies, allowing each model to be personalized to client-specific constraints: every client trains only a subset of the global model, which lowers communication costs while preserving performance. The framework is also privacy-friendly in that clients transmit only selected layers back to the server, limiting how much of each local model's structure the server can observe.

Theoretical analysis establishes the efficiency of FedP3 and of its privacy-preserving variant, DP-FedP3. Experiments on benchmark datasets such as CIFAR10, CIFAR100, and FashionMNIST show that FedP3 achieves significant communication cost reductions without compromising performance. The work also compares several pruning strategies, including uniform and ordered dropout, and evaluates different aggregation methods for model updates. FedP3 outperforms existing methods in both communication efficiency and accuracy, particularly under non-IID data distributions. Its adaptability and effectiveness in handling model heterogeneity make it a promising solution for federated learning applications.
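The difference between the two pruning strategies mentioned above is easiest to see on a single layer. Below is a minimal PyTorch sketch, not the authors' released code: the function names are ours, and the masks are applied to the output neurons of a linear layer purely for illustration. Ordered dropout always keeps a contiguous prefix of units, so smaller submodels nest inside larger ones, while uniform dropout keeps a random subset.

```python
import torch

def ordered_mask(width: int, keep_ratio: float) -> torch.Tensor:
    """Ordered dropout: always keep a contiguous prefix of units."""
    k = max(1, int(width * keep_ratio))
    mask = torch.zeros(width, dtype=torch.bool)
    mask[:k] = True
    return mask

def uniform_mask(width: int, keep_ratio: float) -> torch.Tensor:
    """Uniform dropout: keep a random subset of units."""
    k = max(1, int(width * keep_ratio))
    mask = torch.zeros(width, dtype=torch.bool)
    mask[torch.randperm(width)[:k]] = True
    return mask

def prune_linear(layer: torch.nn.Linear, mask: torch.Tensor) -> torch.nn.Linear:
    """Extract the sub-layer whose output units survive the mask."""
    sub = torch.nn.Linear(layer.in_features, int(mask.sum()),
                          bias=layer.bias is not None)
    with torch.no_grad():
        sub.weight.copy_(layer.weight[mask])   # weight rows = output units
        if layer.bias is not None:
            sub.bias.copy_(layer.bias[mask])
    return sub
```

Because ordered masks for different keep ratios are nested, updates from clients with different capacities still overlap on the shared prefix, which is what makes them straightforward to aggregate.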
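On the server side, the key consequence of clients returning only selected layers is that each layer must be averaged over only the clients that actually sent it. The paper evaluates several aggregation methods; the sketch below shows only the simplest plausible rule, per-layer averaging, under the assumption that each client update is a dict mapping layer names to tensors (the function and variable names are hypothetical).

```python
from collections import defaultdict
import torch

def aggregate(global_state: dict, client_updates: list) -> dict:
    """Average each layer over the clients that transmitted it;
    layers no client reported keep their previous global values."""
    sums = defaultdict(lambda: 0.0)
    counts = defaultdict(int)
    for update in client_updates:          # update: {layer_name: tensor}
        for name, tensor in update.items():
            sums[name] = sums[name] + tensor
            counts[name] += 1
    new_state = dict(global_state)
    for name in sums:
        new_state[name] = sums[name] / counts[name]
    return new_state
```

Keeping untouched layers at their previous global values is one simple design choice; weighting clients by dataset size or layer overlap would be a natural refinement.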