Abstract

We introduce the concept of cognitive cloud offloading, in which all viable wireless interfaces of a multi-radio device are used for computation offloading. We propose a time- and wireless-adaptive heuristic for offloading computationally intensive applications to a remote cloud, with the goals of reducing the mobile device's energy consumption, shortening the application's execution time, and making efficient use of the device's multiple radio interfaces. The proposed heuristic simultaneously determines: 1) the execution place of each application component (mobile or cloud); 2) the amount of the associated data to be sent via each available radio interface; and 3) the scheduling order of the application components. We define a net utility function that trades off mobile device resources (battery, CPU, and memory) against real-time communication costs, such as latency and communication energy, subject to constraints that ensure queue stability at the radio interfaces. Simulations using real data from an HTC smartphone running multi-component applications, with Amazon EC2 as the cloud and two radios, LTE and WiFi, show that cognitive cloud offloading provides higher net utility than the best-interface protocol. The scalability of the proposed heuristic is further analyzed across component dependency graphs with varying numbers of levels and a range of energy-delay trade-off factors.
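
To make the trade-off concrete, below is a minimal Python sketch of the kind of net-utility computation and multi-interface data split the abstract describes. It assumes a simple linear cost model; all function names, weights, and rate values here are illustrative assumptions, not the paper's actual formulation.

# Hypothetical sketch of the net-utility trade-off from the abstract;
# the weights and the linear model are illustrative assumptions only.

def net_utility(local_energy_saved_j, cpu_cycles_saved, mem_bytes_saved,
                latency_s, tx_energy_j,
                w_energy=1.0, w_cpu=1e-9, w_mem=1e-6,
                w_latency=0.5, w_tx=1.0):
    """Resource savings from offloading minus real-time communication costs."""
    gain = (w_energy * local_energy_saved_j
            + w_cpu * cpu_cycles_saved
            + w_mem * mem_bytes_saved)
    cost = w_latency * latency_s + w_tx * tx_energy_j
    return gain - cost

def split_latency(data_bits, rates_bps, fractions):
    """Latency when a component's data is split across radio interfaces:
    each interface carries its fraction in parallel, so the transfer
    finishes when the slowest share completes."""
    return max(f * data_bits / r
               for f, r in zip(fractions, rates_bps) if f > 0)

# Example: offload decision for one component over LTE + WiFi.
data = 8e6                  # 1 MB of associated data, in bits
rates = [20e6, 50e6]        # assumed LTE and WiFi rates (bps)
fractions = [0.3, 0.7]      # share of the data sent via each interface
lat = split_latency(data, rates, fractions)
u = net_utility(local_energy_saved_j=2.0, cpu_cycles_saved=5e8,
                mem_bytes_saved=2e7, latency_s=lat, tx_energy_j=0.8)
print(f"latency={lat:.3f}s  net utility={u:.3f}")  # offload if u > 0

A positive net utility would favor offloading the component and splitting its data across both radios; the actual heuristic additionally enforces queue-stability constraints at each interface, which this sketch omits.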
