15 Apr 2024 | Kai Yi, Nidham Gazagnadou, Peter Richtárik, Lingjuan Lyu
FedP3 is a federated learning framework designed to address model heterogeneity while preserving privacy. It combines global and local pruning strategies to improve communication efficiency and enable model personalization: each client obtains a personalized subnetwork tailored to its own constraints, such as computational power, memory, and network bandwidth. FedP3 is also privacy-friendly, since after local training each client shares only a designated subset of layers with the server rather than the full model. The framework handles both data and model diversity, supporting non-iid data distributions and heterogeneous client-model architectures. A locally differentially private variant, LDP-FedP3, provides formal privacy guarantees together with utility and communication-cost bounds. Convergence theory and communication-cost analysis are given for both FedP3 and LDP-FedP3, with comparisons to existing methods. Experiments on benchmark datasets including CIFAR10/100, EMNIST, and FashionMNIST show that FedP3 significantly reduces communication costs while maintaining performance across datasets and pruning strategies, validating its effectiveness in federated learning scenarios with model heterogeneity and privacy constraints.
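To make the core ideas concrete, here is a minimal sketch of one client round in this style of framework. All names (`local_prune`, `client_update`, `shared_layer_names`, `sigma`) are illustrative, not the paper's actual API: it assumes simple unstructured magnitude pruning as the local pruning strategy, and adds optional Gaussian noise to the uploaded layers to mimic the local-DP variant at a sketch level (calibrating the noise to a formal (epsilon, delta) guarantee is out of scope here).

```python
import numpy as np

def local_prune(weights, keep_ratio):
    """Magnitude-based local pruning: zero out the smallest-magnitude entries.

    This is one common pruning choice; the paper studies several strategies,
    and this sketch assumes simple unstructured magnitude pruning.
    """
    flat = np.abs(weights).ravel()
    k = max(1, int(keep_ratio * flat.size))
    threshold = np.partition(flat, -k)[-k]
    mask = np.abs(weights) >= threshold
    return weights * mask

def client_update(model_layers, shared_layer_names, keep_ratio, sigma=0.0, rng=None):
    """Prune locally, then return only the layers designated for upload.

    If sigma > 0, Gaussian noise is added to the uploaded layers, mimicking
    the LDP-FedP3 idea at a sketch level (hypothetical, not the paper's code).
    """
    rng = rng or np.random.default_rng(0)
    pruned = {name: local_prune(w, keep_ratio) for name, w in model_layers.items()}
    upload = {}
    for name in shared_layer_names:
        w = pruned[name]
        if sigma > 0:
            w = w + rng.normal(0.0, sigma, size=w.shape)
        upload[name] = w
    return upload

# Usage: a toy two-layer model where only "layer1" is shared with the server,
# so the upload payload is strictly smaller than the full model.
layers = {"layer1": np.arange(12, dtype=float).reshape(3, 4),
          "layer2": np.ones((4, 2))}
payload = client_update(layers, shared_layer_names=["layer1"], keep_ratio=0.5)
assert set(payload) == {"layer1"}           # only the designated layer is uploaded
assert (payload["layer1"] == 0).sum() == 6  # half the entries were pruned away
```

The design point this illustrates is that communication savings come from two independent knobs: the pruning ratio (fewer nonzero weights per layer) and the choice of which layers to upload at all.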