This paper explores the integration of Federated Learning (FL) with Edge Computing (EC) to address challenges in data privacy, security, and efficient model training. Edge computing decentralizes data processing by moving it closer to the data source, reducing latency and improving performance. However, transmitting large volumes of data to centralized servers poses communication, reliability, and security challenges. FL addresses these by enabling decentralized model training without sharing raw data: multiple edge devices collaboratively train a shared model while keeping their local data on-device, preserving privacy and reducing data transmission.
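The collaborative training scheme described above can be illustrated with a minimal federated averaging (FedAvg) sketch. This is not the paper's own algorithm but a standard baseline, shown here under illustrative assumptions: a simple linear-regression task, synthetic per-client data, and hypothetical function names (`local_update`, `fed_avg`).

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local gradient-descent steps on its private data.
    Only the resulting weights leave the device, never (X, y)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w

def fed_avg(client_data, rounds=20, dim=2):
    """Server loop: broadcast the global model, collect local updates,
    and aggregate them weighted by each client's dataset size."""
    global_w = np.zeros(dim)
    total = sum(len(y) for _, y in client_data)
    for _ in range(rounds):
        updates = [local_update(global_w, X, y) for X, y in client_data]
        global_w = sum(len(y) / total * w
                       for (X, y), w in zip(client_data, updates))
    return global_w

# Synthetic demo: three edge clients, each with its own private shard
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.01, size=50)
    clients.append((X, y))

w = fed_avg(clients)
```

The key privacy property is visible in the data flow: the server sees only `updates`, never the clients' `(X, y)` pairs, which is what distinguishes FL from centralized training.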
FL is particularly beneficial in EC environments where data is generated by diverse nodes such as vehicles, healthcare facilities, and IoT devices. It enables the optimization of models through multi-peer interactions, ensuring data freshness and reducing the need for data migration. Despite its potential, FL implementation in EC contexts remains underexplored, with limited comprehensive evaluations of its challenges and solutions.
The paper reviews existing literature on FL in EC, highlighting key research areas, challenges, and future directions. It discusses the fundamentals of EC and FL, their integration, and the technical and practical considerations for their implementation. The study also addresses hardware requirements for FL in EC, emphasizing the need for efficient processors and accelerators to support the computational demands of deep learning models.
Applications of FL in EC include computation offloading, content caching, and malware detection. The paper also explores the broader implications of FL in social marketing and its potential to foster interdisciplinary collaboration among technology, education, and foreign language studies. The research aims to provide a systematic review of FL in EC, identifying key challenges, opportunities, and future research directions to advance the field.