Abstract

The ad-hoc mobile cloud has recently emerged as a promising architecture that allows a mobile user to offload its computation tasks to nearby mobile devices, namely mobile cloudlets, with short offloading latency and low bandwidth consumption over wireless local area network connections. However, due to uncertainty in user and cloudlet mobility and in the availability of cloudlets' computation resources, it is challenging to design an effective offloading scheme that can best provide the required quality of service (QoS) for the user. In this paper, we develop a constrained Markov decision process (CMDP) formulation that enables the mobile user to make optimal offloading decisions while meeting its QoS requirements. The objective of the proposed CMDP formulation is to maximize the user's utility obtained from task execution, subject to constraints on the required payment, energy consumption, processing delay, and task loss probability. Two schemes, one based on linear programming (LP) and one based on Q-learning (QL), are proposed to find the optimal solution to the formulated CMDP offloading problem. The LP-based scheme allows the user to obtain a QoS-aware optimal offloading policy given prior knowledge of the system state transition probabilities, while the QL-based scheme enables the user to learn an effective offloading policy in an unknown ad-hoc mobile cloud system. Extensive simulations were performed to evaluate the performance of the proposed schemes. The simulation results show the effectiveness of the offloading policies obtained by the proposed schemes, which outperform baseline schemes.
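The QL-based scheme described above can be illustrated with a minimal sketch of Lagrangian Q-learning for a constrained MDP: the agent learns Q-values for a penalized reward (utility minus a multiplier times cost) while adjusting the multiplier toward a cost budget via a dual update. The toy state space (cloudlet availability levels), the actions, and all numeric values here are illustrative assumptions, not taken from the paper.

```python
import random

# Hypothetical toy model: states are cloudlet-availability levels, actions are
# "local" (execute on device) or "offload". All numbers are illustrative.
STATES = [0, 1, 2]          # 0: no cloudlet, 1: one nearby, 2: several nearby
ACTIONS = ["local", "offload"]

def step(s, a):
    """Return (next_state, utility, cost). Utility rewards offloading when
    cloudlets are available; cost models energy/payment per offload."""
    if a == "offload" and s > 0:
        utility, cost = 1.0 * s, 0.5
    elif a == "offload":            # offload attempt with no cloudlet fails
        utility, cost = -1.0, 0.5
    else:
        utility, cost = 0.3, 0.1
    s_next = random.choice(STATES)  # i.i.d. mobility model for simplicity
    return s_next, utility, cost

def q_learning_cmdp(steps=5000, gamma=0.9, alpha=0.1, eps=0.1,
                    cost_budget=0.3, lam_step=0.01):
    """Lagrangian Q-learning: learn Q for reward = utility - lam * cost,
    while moving the multiplier lam toward satisfying the cost budget."""
    Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    lam = 0.0
    s = random.choice(STATES)
    for _ in range(steps):
        # epsilon-greedy action selection on the penalized Q-values
        a = (random.choice(ACTIONS) if random.random() < eps
             else max(ACTIONS, key=lambda x: Q[(s, x)]))
        s2, u, c = step(s, a)
        target = (u - lam * c) + gamma * max(Q[(s2, x)] for x in ACTIONS)
        Q[(s, a)] += alpha * (target - Q[(s, a)])
        lam = max(0.0, lam + lam_step * (c - cost_budget))  # dual ascent step
        s = s2
    policy = {s: max(ACTIONS, key=lambda x: Q[(s, x)]) for s in STATES}
    return Q, policy, lam

random.seed(0)
Q, policy, lam = q_learning_cmdp()
print(policy)
```

In this toy setting the learned policy executes locally when no cloudlet is available and offloads when several are, with the multiplier settling wherever the long-run offloading cost meets the budget; the paper's LP-based scheme would instead compute such a policy directly from known transition probabilities.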
