Toronto Metropolitan University

File(s) not publicly available

Task Offloading and Resource Allocation in Vehicular Networks: A Lyapunov Based Deep Reinforcement Learning Approach

journal contribution
posted on 2024-11-21, 01:53 authored by Anitha Saravana Kumar, Xavier Fernando, Lian Zhao

Vehicular Edge Computing (VEC) has gained popularity due to its ability to enhance vehicular networks. VEC servers located at Roadside Units (RSUs) allow low-power vehicles to offload computation-intensive and delay-sensitive applications, making VEC a promising solution. However, optimal resource allocation among edge servers is a complex problem due to vehicle mobility and dynamic data traffic. To address this, we propose a Lyapunov-based Multi-Agent Deep Deterministic Policy Gradient (L-MADDPG) method that jointly optimizes computing-task distribution and radio resource allocation to minimize energy consumption while meeting delay requirements. We evaluate the trade-offs among the optimization algorithm's performance, the queuing model, and energy consumption. We first develop delay, queue, and energy models for task execution at the vehicle or the RSU, and then present the L-MADDPG algorithm, which jointly solves the task offloading and resource allocation problems to reduce energy consumption without compromising performance. Simulation results show that, compared with existing algorithms, our approach reduces energy consumption while maintaining system performance.
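To illustrate the Lyapunov side of the approach, the sketch below shows the standard drift-plus-penalty idea that Lyapunov-based offloading schemes build on: a virtual queue tracks task backlog, and each decision (execute locally or offload to the RSU) is scored by a weighted sum of energy cost and queue backlog, with a parameter V trading energy against delay. All function names, parameters, and the specific scoring rule are illustrative assumptions for exposition, not the paper's actual L-MADDPG implementation.

```python
def queue_update(q, arrival, service):
    """Lyapunov virtual-queue update: Q(t+1) = max(Q(t) - b(t), 0) + a(t).

    q: current backlog, arrival: bits arriving this slot,
    service: bits the chosen action can process this slot.
    """
    return max(q - service, 0.0) + arrival


def choose_offload(q, task_bits,
                   local_energy_per_bit, offload_energy_per_bit,
                   local_rate, offload_rate, v=10.0):
    """Pick local vs. RSU execution by minimizing a drift-plus-penalty score:
    V * energy + Q * residual backlog. Larger V favors energy savings;
    a large backlog Q pushes the decision toward the faster option.
    (Hypothetical single-agent rule; the paper learns this jointly via MADDPG.)
    """
    local_score = (v * task_bits * local_energy_per_bit
                   + q * max(task_bits - local_rate, 0.0))
    offload_score = (v * task_bits * offload_energy_per_bit
                     + q * max(task_bits - offload_rate, 0.0))
    if local_score <= offload_score:
        return "local", local_score
    return "offload", offload_score
```

With an empty queue the cheaper local execution wins, but as backlog grows the rule switches to the faster (if more energy-hungry) RSU link, which is exactly the energy/delay trade-off the Lyapunov penalty weight V controls.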

Language

English
