2024 | MEKALA, M.S., DHIMAN, G., SRIVASTAV, G., NAIN, Z., ZHANG, H., VIRIYASITAVAT, W. and VARMA, G.P.S.
This paper proposes a two-step deep reinforcement learning (DRL)-based service offloading (DSO) approach that models applications as directed acyclic graphs (DAGs) for edge computational orchestration. The DSO approach aims to reduce edge-server costs by analyzing resources and service execution time (SET) through a DRL-influenced resource and SET analysis (RSA) model. The first level considers service and edge-server costs during service offloading, while the second level evaluates resource factors to optimize resource sharing and dampen SET fluctuations using the R-retaliation method. Simulation results show that the proposed DSO approach achieves low execution cost by jointly streamlining dynamic service completion time, transmission time, server cost, and deadline violation rate; compared with state-of-the-art approaches, it achieves higher resource usage with lower energy consumption. The DSO approach is designed to optimize service offloading by reducing offloading time, execution cost, and waiting-queue length while adapting resource utilization. It measures the resource provision (RP) rate of arriving services from the current status of the edge servers (ESs) using a prognostic service execution time (PSET) method. The contributions of this work are: (i) a DSO approach that reduces subservice execution time (SET) cost and transmission time and optimizes the energy usage rate; (ii) an R-retaliation analysis model that optimizes the RP rate, service deadline violation rate, and SET fluctuations based on prognostic big-data evaluation factors; (iii) a prognostic execution-cost method that regulates service execution time fluctuations; and (iv) adaptive methods that evaluate service request transmission time and preserve energy, among other factors.
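The paper's core idea can be illustrated with a minimal sketch: subservices form a DAG, each edge server is scored by its predicted execution time (a stand-in for the PSET method) weighted by cost, and a load penalty spreads resource provision. This is a hypothetical greedy simplification for illustration only; the actual DSO policy is learned with DRL, and all names (`EdgeServer`, `pset`, `offload_dag`) and the scoring formula are assumptions, not the authors' implementation.

```python
from dataclasses import dataclass
from graphlib import TopologicalSorter

@dataclass
class EdgeServer:
    # Hypothetical edge-server model: compute speed and monetary cost rate.
    speed: float       # work units processed per second
    cost_rate: float   # cost per second of execution
    load: float = 0.0  # accumulated busy time (proxy for waiting-queue length)

def pset(server: EdgeServer, workload: float) -> float:
    """Stand-in for prognostic service execution time (PSET):
    predicted finish time = queueing delay + compute time."""
    return server.load + workload / server.speed

def offload_dag(dag: dict, workloads: dict, servers: list) -> dict:
    """Greedy stand-in for the two-level DSO policy.
    Level 1: score each server by predicted execution cost (PSET * cost rate).
    Level 2: a small load penalty balances resource provision across servers.
    `dag` maps each subservice to the list of its predecessor subservices."""
    assignment = {}
    # Place subservices in topological order so predecessors are placed first.
    for s in TopologicalSorter(dag).static_order():
        best = min(
            servers,
            key=lambda e: pset(e, workloads[s]) * e.cost_rate + 0.1 * e.load,
        )
        best.load += workloads[s] / best.speed  # server becomes busier
        assignment[s] = servers.index(best)
    return assignment

# Usage with a toy 4-subservice DAG and two heterogeneous servers.
servers = [EdgeServer(speed=10.0, cost_rate=2.0), EdgeServer(speed=5.0, cost_rate=1.0)]
dag = {"a": [], "b": ["a"], "c": ["a"], "d": ["b", "c"]}
workloads = {"a": 10, "b": 20, "c": 5, "d": 10}
print(offload_dag(dag, workloads, servers))
```

The load term in the score is what makes the sketch two-level: cost drives the primary choice, while accumulated load nudges later subservices toward less-utilized servers, mirroring the paper's goal of balancing execution cost against resource-provision rate.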
The proposed DSO approach is evaluated through simulations that demonstrate its effectiveness in reducing execution cost, improving resource usage, and minimizing energy consumption. The results show that it outperforms existing methods in terms of service reliability, resource utilization, and energy efficiency. The paper also analyzes the time complexity of the proposed algorithms, which is essential for optimizing the objective. Overall, the DSO approach is shown to reduce service execution delay, optimize resource allocation, and improve service reliability in edge computing environments.