Abstract

Green computing has recently emerged as a promising way to pair renewable energy sources (energy harvesting (EH)) with the paradigm of energy-constrained, heterogeneous Internet of Things (IoT) networks. A mobile edge computing (MEC) network supports complex computation tasks at the edge of IoT devices through an optimal offloading rate and transmission rate. We propose a reinforcement learning (RL) scheme based on the Dyna-Q architecture that incorporates both EH and MEC to recharge the battery and to reduce the computation latency of delay-sensitive IoT nodes, respectively. First, using Q-learning in a multichannel heterogeneous network, the RL agent selects an MEC server and an offloading rate according to the transmission-rate history of each MEC server, the channel gain between the IoT node and the MEC server, the priority of the computation task, the predicted amount of harvested energy, and the battery level of the IoT node. Further, to improve the learning rate of Q-learning, we propose a Dyna-Q architecture-based scheme coupled with a post-decision state (PDS) learning approach, in which real experience is used to accelerate the convergence rate. Finally, simulations show that the presented scheme outperforms baselines in convergence rate, offloading rate, and network energy consumption.
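To make the Dyna-Q idea concrete, the sketch below shows the generic Dyna-Q loop the abstract refers to: a direct Q-learning update from each real transition, a learned transition model, and extra planning updates replayed from that model, which is what speeds up convergence over plain Q-learning. The state and action encodings (battery level, channel index, task priority; server and offload-rate choices), the reward, and all names here are hypothetical illustrations, since the abstract does not specify the paper's actual state space, reward function, or PDS update.

```python
import random
from collections import defaultdict

# Hyperparameters (illustrative values, not from the paper).
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
PLANNING_STEPS = 10  # simulated updates per real step (Dyna-Q planning)

Q = defaultdict(float)  # Q[(state, action)] -> action value
model = {}              # model[(state, action)] -> (reward, next_state)

def choose_action(state, actions):
    """Epsilon-greedy choice over hypothetical (server, offload-rate) actions."""
    if random.random() < EPSILON:
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])

def dyna_q_step(state, action, reward, next_state, actions):
    # 1) Direct RL: one-step Q-learning update from the real transition.
    best_next = max(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

    # 2) Model learning: store the real experience (deterministic model).
    model[(state, action)] = (reward, next_state)

    # 3) Planning: replay simulated experience from the learned model;
    #    these extra backups are what accelerate convergence.
    for _ in range(PLANNING_STEPS):
        (s, a), (r, s2) = random.choice(list(model.items()))
        best = max(Q[(s2, b)] for b in actions)
        Q[(s, a)] += ALPHA * (r + GAMMA * best - Q[(s, a)])
```

In a setting like the paper's, `state` might bucket the IoT node's battery level, observed channel gains, and task priority, while each action pairs an MEC server with a discretized offloading rate; the PDS refinement described in the abstract would replace step 1 with an update at the known post-decision state, which this generic sketch omits.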
