Convergence of Edge Computing and Deep Learning: A Comprehensive Survey

28 Jan 2020 | Xiaofei Wang, Yiwen Han, Victor C.M. Leung, Dusit Niyato, Xueqiang Yan, Xu Chen
This paper provides a comprehensive survey of the convergence of edge computing and deep learning (Edge DL), covering its applications, implementation methods, challenges, and future trends. Edge computing, which brings computation closer to data sources, is increasingly integrated with deep learning to address the limitations of cloud computing, such as high latency, data transmission costs, and privacy concerns. Edge DL aims to deploy deep learning services at the network edge to provide low-latency, reliable, and efficient intelligent services, while also using deep learning techniques to support dynamic and adaptive edge maintenance and management.

The survey identifies five essential technologies for Edge DL: 1) DL applications on edge, 2) DL inference on edge, 3) edge computing for DL, 4) DL training on edge, and 5) DL for optimizing edge, and discusses each in detail, highlighting its role in enabling Edge DL. Combining edge computing with deep learning reduces latency, improves service response, and enhances data privacy; the paper argues that this integration is key to achieving pervasive, fine-grained intelligence and to realizing the vision of "providing AI for every person and every organization at everywhere."

The paper also examines the building blocks of edge computing, including its paradigms, hardware, virtualization techniques, and network slicing, and reviews the fundamentals of deep learning, including neural networks, deep reinforcement learning, and distributed DL training. It stresses the importance of efficient, lightweight DL libraries for edge computing and the challenges of deploying DL models on edge devices, notably resource constraints and the need for model optimization.

Representative Edge DL applications are surveyed as well, including real-time video analytics, autonomous Internet of Vehicles (IoVs), intelligent manufacturing, and smart home and city systems. Deploying DL at the edge improves performance, reduces latency, and enhances privacy, but it also demands model optimization, efficient inference, and careful resource management. Overall, the paper offers a comprehensive overview of the convergence of edge computing and deep learning and its potential to transform a wide range of applications and services.
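To make the model-optimization point concrete, the following is a minimal sketch (not taken from the survey) of post-training quantization with TensorFlow Lite, one common way to compress a trained network so it fits the resource constraints of an edge device. It assumes a TensorFlow 2.x environment; the small Keras model and the output filename edge_model.tflite are placeholders chosen purely for illustration.

import tensorflow as tf

# Hypothetical model: a tiny Keras classifier standing in for any trained network.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert to a TensorFlow Lite flatbuffer with default optimization, which
# quantizes weights to 8-bit integers: a much smaller, faster model on edge
# hardware in exchange for a typically small accuracy loss.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# The resulting bytes can be shipped to the device and run with tf.lite.Interpreter.
with open("edge_model.tflite", "wb") as f:
    f.write(tflite_model)

Weight quantization alone usually shrinks the model by roughly 4x (float32 to int8), which is often the difference between a network that fits on an edge device and one that does not.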
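Similarly, the distributed DL training and "DL training on edge" themes can be sketched with the basic federated-averaging pattern, in which edge devices train locally and a coordinator aggregates only their model weights, so raw data never leaves the device. The code below is an illustrative assumption rather than the survey's algorithm; federated_average and the two toy "devices" are hypothetical names.

import numpy as np

def federated_average(client_weights, client_sizes):
    """Weighted average of per-client weight lists, weighted by local data size."""
    total = sum(client_sizes)
    num_layers = len(client_weights[0])
    averaged = []
    for layer in range(num_layers):
        # Each client's contribution is proportional to its share of the data.
        layer_sum = sum(w[layer] * (n / total)
                        for w, n in zip(client_weights, client_sizes))
        averaged.append(layer_sum)
    return averaged

# Hypothetical example: two edge devices, each holding a single-layer model.
device_a = [np.array([[1.0, 2.0]])]
device_b = [np.array([[3.0, 4.0]])]
global_weights = federated_average([device_a, device_b], client_sizes=[100, 300])
print(global_weights)  # [array([[2.5, 3.5]])] -> weighted toward the larger client

The design choice here mirrors the privacy argument in the summary: only model parameters cross the network, while the training data stays on the edge device.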