Edge computing plays an essential role in vehicle-to-infrastructure (V2I) networks, where vehicles offload their computation-intensive tasks to road-side units to save energy and reduce latency. This paper designs an optimal task offloading policy that jointly addresses processing delay, energy consumption, and edge computing cost. Each computation task, consisting of several interdependent sub-tasks, is modeled as a directed acyclic graph (DAG). For such dynamic networks, a novel hierarchical offloading scheme is proposed by leveraging deep reinforcement learning (DRL). The inter-dependencies among the sub-tasks in each DAG are extracted using a graph neural network with an attention mechanism, and a parameterized DRL algorithm is developed to handle the hierarchical action space containing both discrete and continuous actions. Simulation results with a real-world vehicle speed dataset demonstrate that the proposed scheme effectively reduces the system overhead.
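The task model described above (interdependent sub-tasks forming a DAG, where a sub-task can be offloaded only after its predecessors complete) can be sketched minimally as follows. This is an illustrative sketch, not the paper's implementation; the sub-task names and dependency structure are assumptions, and the ordering is computed with Kahn's topological-sort algorithm, which any DAG-based offloading scheduler must respect:

```python
from collections import deque

# Hypothetical DAG task: each sub-task lists its predecessors.
# An edge p -> t means sub-task t may start (locally or at a
# road-side unit) only after predecessor p has finished.
deps = {
    "t1": [],            # entry sub-task
    "t2": ["t1"],
    "t3": ["t1"],
    "t4": ["t2", "t3"],  # exit sub-task
}

def topo_order(deps):
    """Kahn's algorithm: return one dependency-respecting order
    in which the sub-tasks can be scheduled/offloaded."""
    indeg = {t: len(p) for t, p in deps.items()}
    succ = {t: [] for t in deps}
    for t, preds in deps.items():
        for p in preds:
            succ[p].append(t)
    ready = deque(t for t, d in indeg.items() if d == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for s in succ[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    return order

print(topo_order(deps))  # e.g. ['t1', 't2', 't3', 't4']
```

In the scheme described by the abstract, the per-sub-task decision along such an order would be hierarchical: a discrete choice (execute locally vs. offload to a road-side unit) paired with a continuous one (e.g., how much transmit power or computing resource to allocate), which is what motivates a parameterized DRL algorithm over a purely discrete one.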