Abstract

With the development of Big Data technology and the Internet, surging network traffic causes congestion and delays in task processing. In addition, caching content in the core network can lead to redundant content access and backhaul bottlenecks. As users' requirements for task-processing efficiency grow, centralized maintenance systems based on traditional cloud computing can no longer meet current computing demands. To address these problems, we propose a collaborative content caching mechanism based on a joint decision over download delay and energy consumption. By integrating network coding with content caching, the work content maintained in the communication network is deployed in coded form near the network edge, reducing redundant content transmission and content acquisition time. This article establishes a user QoE satisfaction model consisting of two indexes that measure delay and energy consumption. It proposes an <inline-formula><tex-math notation="LaTeX">$\varepsilon$</tex-math></inline-formula>-hybrid Q-learning algorithm to optimize the placement of cached files, in which cache actions are selected by combining an improved heuristic greedy algorithm with a simulated annealing algorithm. Experimental results show that the proposed caching strategy reduces users' content download delay and the energy consumption of content caching, thereby improving the quality of field maintenance work in the communication network.
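The abstract gives no implementation details, so the following is only a minimal sketch of how an ε-hybrid action-selection rule of the kind described might look: with probability ε the agent exploits a heuristic-greedy choice, and otherwise it explores via a simulated-annealing (Metropolis) accept/reject step. All function names, parameters, and the exact way the heuristic scores are combined with Q-values are assumptions for illustration, not the paper's published method.

```python
import math
import random

def epsilon_hybrid_select(q_row, heuristic_scores, epsilon, temperature, current_action):
    """Pick a cache-placement action for one state (illustrative sketch).

    q_row            -- Q-values for each candidate cache action in this state
    heuristic_scores -- heuristic estimates per action (assumed, e.g. content popularity)
    epsilon          -- probability of taking the heuristic-greedy (exploit) branch
    temperature      -- simulated-annealing temperature for the explore branch
    current_action   -- action currently held; the annealing baseline
    """
    n = len(q_row)
    if random.random() < epsilon:
        # Exploit: greedy choice over Q-value plus heuristic score (assumed combination).
        return max(range(n), key=lambda a: q_row[a] + heuristic_scores[a])
    # Explore: propose a random action and accept it with the Metropolis criterion.
    proposal = random.randrange(n)
    delta = q_row[proposal] - q_row[current_action]
    if delta >= 0 or random.random() < math.exp(delta / max(temperature, 1e-9)):
        return proposal
    return current_action

def q_update(q_row, action, reward, next_best_q, alpha=0.1, gamma=0.9):
    """Standard one-step Q-learning update for the chosen cache action."""
    q_row[action] += alpha * (reward + gamma * next_best_q - q_row[action])
```

In this sketch the reward would encode the joint delay/energy QoE objective, and the temperature would be decayed over training so exploration narrows as the cache policy converges; both choices are assumptions consistent with, but not specified by, the abstract.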
