Multi-access (Mobile) Edge Computing (MEC) is a promising solution for computationally demanding applications that must meet 6G network service requirements. However, edge servers incur high computation costs during task processing. In this paper, we propose a technique that minimizes the total computation and communication overhead through joint computation offloading, achieving optimal resource utilization and enabling an energy-efficient (green) environment. The resulting optimization problem is NP-hard; we therefore propose a decentralized Reinforcement Learning (dRL) approach that mitigates the curse of dimensionality and the over-estimation of value functions. Compared to baseline schemes, our technique achieves a 37.03% reduction in total system cost.
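The abstract does not specify how the dRL approach curbs value over-estimation; a common remedy is a Double-DQN-style update that decouples action selection from action evaluation. The sketch below is a minimal illustration under that assumption; all names (q_online, q_target, the state/action sizes, and the cost-based reward) are hypothetical and not taken from the paper.

```python
# Hedged sketch: Double-DQN-style target computation, one standard way to
# reduce over-estimation of value functions. Purely illustrative; the paper's
# actual dRL agent and MEC state/action model are not described in the abstract.
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions = 8, 4          # assumed: offloading decisions per edge agent
gamma = 0.99                        # assumed discount factor

q_online = rng.normal(size=(n_states, n_actions))   # online value estimates
q_target = rng.normal(size=(n_states, n_actions))   # slowly updated copy

def double_q_target(reward: float, next_state: int, done: bool) -> float:
    """The online table selects the next action; the target table evaluates it,
    decoupling selection from evaluation to limit over-estimation bias."""
    best_a = int(np.argmax(q_online[next_state]))
    bootstrap = 0.0 if done else gamma * q_target[next_state, best_a]
    return reward + bootstrap

# Example: reward taken as the negative total system cost
# (computation + communication overhead), as suggested by the objective.
y = double_q_target(reward=-2.5, next_state=3, done=False)
print(f"TD target: {y:.3f}")
```

In a decentralized setting, each edge agent would maintain its own pair of estimators and compute such targets from locally observed costs, which keeps the per-agent action space small and avoids a centrally maintained joint value function.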