FEDERATED OPTIMIZATION IN HETEROGENEOUS NETWORKS

2020 | Tian Li, Anit Kumar Sahu, Manzil Zaheer, Maziar Sanjabi, Ameet Talwalkar, Virginia Smith
FedProx is a federated optimization framework designed to address the challenges of heterogeneity in federated learning. Unlike FedAvg, which assumes a uniform amount of local work across devices, FedProx allows variable local work based on device-specific constraints and incorporates a proximal term to improve stability and convergence. The framework handles both statistical heterogeneity (non-identical data distributions across devices) and systems heterogeneity (differences in device capabilities such as compute, memory, and connectivity).

Theoretically, FedProx provides convergence guarantees under these heterogeneous conditions; empirically, it demonstrates improved stability and accuracy compared to FedAvg, particularly in highly heterogeneous settings. FedProx achieves this by accepting partial updates from stragglers rather than dropping them, and by adding a proximal term that keeps local updates close to the global model, thereby mitigating the negative effects of statistical heterogeneity. Empirical results on a suite of synthetic and real-world federated datasets show that FedProx is more robust and efficient under systems heterogeneity, improving test accuracy by 22% on average in highly heterogeneous environments.
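The core idea above can be sketched in code: each device k approximately minimizes its local loss F_k(w) plus a proximal penalty (mu/2)·||w − w_global||², and the server averages the resulting models. The sketch below is illustrative, not the paper's reference implementation; the function names, gradient-descent local solver, and toy quadratic losses are assumptions made for the example.

```python
import numpy as np

def fedprox_local_update(w_global, grad_fn, mu=0.1, lr=0.01, num_steps=50):
    """One device's approximate solve of the FedProx subproblem:
        min_w  F_k(w) + (mu/2) * ||w - w_global||^2
    grad_fn(w) returns the gradient of the local loss F_k at w.
    num_steps can differ per device, modeling partial work from stragglers."""
    w = w_global.copy()
    for _ in range(num_steps):
        # Proximal term mu * (w - w_global) pulls the iterate toward the global model.
        g = grad_fn(w) + mu * (w - w_global)
        w -= lr * g
    return w

def fedprox_round(w_global, device_grads, device_steps, mu=0.1, lr=0.01):
    """One communication round: devices do (possibly unequal) local work,
    then the server averages their models, as in FedAvg."""
    updates = [
        fedprox_local_update(w_global, grad, mu=mu, lr=lr, num_steps=steps)
        for grad, steps in zip(device_grads, device_steps)
    ]
    return np.mean(updates, axis=0)

# Toy non-identical data: device k has loss F_k(w) = 0.5 * ||w - c_k||^2,
# with a different center c_k per device (statistical heterogeneity).
centers = [np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([3.0, 3.0])]
device_grads = [lambda w, c=c: w - c for c in centers]

w0 = np.zeros(2)
# Unequal step counts mimic systems heterogeneity: slow devices still contribute.
w1 = fedprox_round(w0, device_grads, device_steps=[50, 10, 2], mu=0.1)
```

With mu = 0 and equal step counts this reduces to FedAvg; a larger mu restrains local drift, which is what stabilizes training when device data distributions differ sharply.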