8 May 2024 | Chunxu Zhang, Guodong Long, Hongkuan Guo, Xiao Fang, Yang Song, Zhaojie Liu, Guorui Zhou, Zijian Zhang, Yang Liu, Bo Yang
The paper introduces a novel federated adaptation mechanism, named Federated Recommendation with Personalized Adapter (FedPA), to enhance foundation model-based recommendation systems while preserving user privacy. Each client learns a lightweight personalized adapter from its private data, which collaborates with a pre-trained foundation model to provide efficient, fine-grained recommendation services. Users' private behavioral data never leaves the device and is not shared with the server, following a data localization-based privacy preservation approach within the federated learning framework. The model incorporates shared knowledge into all adapters while preserving each user's personal preferences. Experimental results on four benchmark datasets demonstrate the superior performance of FedPA compared to advanced baselines, and the implementation code is available for reproducibility.

The main contributions are threefold: the first investigation of federated adaptation for foundation model-based recommendation, the development of a personalized low-rank adapter for efficient modeling of user personalization, and the design of an adaptive gate learning mechanism for effective knowledge fusion. The method also addresses the challenge of deploying models on resource-constrained devices by distilling a compact model from the pre-trained model, and enhances privacy protection by integrating Local Differential Privacy.
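The personalized low-rank adapter and adaptive gate described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the layer sizes, the zero-initialized up-projection, and the scalar sigmoid gate blending the shared and personal paths are all assumptions, written in the spirit of LoRA-style adapters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper)
d_in, d_out, rank = 16, 8, 4

# Frozen foundation-model weight: shared across clients, never updated locally
W = rng.normal(size=(d_in, d_out))

# Personalized low-rank adapter: B @ A has rank <= 4, so it holds far fewer
# parameters than W and is cheap to train and communicate per client.
B = rng.normal(size=(d_in, rank)) * 0.01
A = np.zeros((rank, d_out))  # zero init: the adapter starts as a no-op

# Adaptive gate (hypothetical scalar form): a learned logit squashed by a
# sigmoid, fusing shared knowledge with the client's personal adaptation.
gate_logit = 0.0

def forward(x):
    shared = x @ W            # frozen foundation-model path
    personal = x @ B @ A      # lightweight client-specific path
    g = 1.0 / (1.0 + np.exp(-gate_logit))
    return shared + g * personal
```

Because `A` is zero-initialized, each client's model initially reproduces the frozen foundation model exactly; training then moves only `B`, `A`, and the gate, which is what keeps on-device adaptation lightweight.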