29 May 2024 | Ziqing Fan, Ruipeng Zhang, Jiangchao Yao, Bo Han, Ya Zhang, Yanfeng Wang
This paper proposes FedGELA, a federated learning approach that addresses the challenge of partially class-disjoint data (PCDD) in both personalized federated learning (P-FL) and generic federated learning (G-FL). PCDD arises when each client contributes samples from only a subset of classes, creating conflicts between local and global objectives that manifest as angle collapse of classifier vectors and wasted feature space. FedGELA leverages the properties of the simplex equiangular tight frame (ETF) to maintain a balanced and efficient classifier structure: the global classifier is fixed as a simplex ETF, while local models are adapted to each client's personal distribution, making efficient use of the available feature space and improving both global and local performance. The method is analyzed theoretically and validated experimentally on a range of datasets, showing significant improvements over existing methods, with average gains of 3.9% over FedAvg and 1.5% over the best baselines across global and local tasks. FedGELA is also effective in real-world applications, demonstrating robustness to practical scenarios. The paper provides a comprehensive analysis of the method's performance under different conditions and highlights its effectiveness in addressing the challenges of PCDD in federated learning.
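To make the fixed-classifier idea concrete, the following sketch constructs a simplex ETF of K class prototypes in a d-dimensional feature space, the standard formulation M = sqrt(K/(K-1)) · U(I − 11ᵀ/K) with U an orthonormal basis. This is an illustrative reconstruction, not the authors' released code; the function name and seeding are our own choices.

```python
import numpy as np

def simplex_etf(num_classes: int, feat_dim: int, seed: int = 0) -> np.ndarray:
    """Return a feat_dim x num_classes simplex equiangular tight frame.

    Columns are unit-norm class prototypes whose pairwise cosine
    similarity is -1/(K-1), the maximally separated configuration
    used as the fixed global classifier in ETF-based methods.
    """
    K, d = num_classes, feat_dim
    assert d >= K, "feature dim must be at least the number of classes"
    rng = np.random.default_rng(seed)
    # Random orthonormal columns U (d x K) via QR decomposition.
    U, _ = np.linalg.qr(rng.standard_normal((d, K)))
    # Center the basis and rescale so each prototype has unit norm:
    # M = sqrt(K/(K-1)) * U (I - 11^T / K).
    M = np.sqrt(K / (K - 1)) * U @ (np.eye(K) - np.ones((K, K)) / K)
    return M
```

With this fixed classifier, a client's logits are simply the features projected onto M (`features @ M`); under PCDD each client uses only the columns for its observed classes, while the global geometry stays balanced.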