In Internet applications, conversations are the primary form of communication between users and servers. To maintain a high Quality of Service, the server must quickly and efficiently return the appropriate service for each conversation a user sends, and Conversational Information Seeking (CIS) has therefore become an active research topic. In Cloud Computing (CC), a centralized service mode, conversations travel over long distances between users and the remote cloud. With the explosive growth of Internet applications, network congestion, long-distance communication, and single points of failure pose new challenges to this centralized mode. Edge Cloud Computing (ECC), a distributed extension of CC, was proposed to address these challenges: by migrating services from the remote cloud to the network edge closer to users, ECC largely avoids them. In ECC, CIS is supported through edge caching, and current research focuses on designing cache strategies that make caching more predictable. In this article, we propose an edge cache placement method, the Evolutionary Game based Caching Placement Strategy (EG-CPS), which consists of three modules: user preference prediction, content popularity calculation, and cache placement decision. To maximize the predictability of the cache strategy, we optimize the cache hit rate and the service latency. Simulation experiments compare the proposed strategy with several other cache strategies. The results show that EG-CPS reduces the average content request latency by up to 2.4%, increases the average direct cache hit rate by 1.7%, and increases the average edge cache hit rate by 3.3%.
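To make the three-module structure concrete, the sketch below shows one minimal way such a pipeline could be wired together in Python. All function names, the frequency-based preference model, the greedy placement rule, and the toy request data are assumptions for illustration only; the paper's actual EG-CPS decision module is based on an evolutionary game rather than the simple baseline shown here.

```python
from collections import Counter

# Hypothetical sketch of a three-module caching pipeline:
# user preference prediction -> content popularity calculation -> cache placement decision.
# Names and logic are illustrative, not the authors' implementation of EG-CPS.

def predict_user_preferences(request_history):
    """User preference prediction: per-user request frequencies (toy stand-in)."""
    prefs = {}
    for user, content in request_history:
        prefs.setdefault(user, Counter())[content] += 1
    return prefs

def compute_content_popularity(prefs):
    """Content popularity calculation: aggregate predicted preferences across users."""
    popularity = Counter()
    for counts in prefs.values():
        popularity.update(counts)
    return popularity

def decide_cache_placement(popularity, capacity):
    """Cache placement decision: greedily cache the most popular items.
    (A simple baseline; EG-CPS instead derives placement from an evolutionary game.)"""
    return [content for content, _ in popularity.most_common(capacity)]

if __name__ == "__main__":
    # Toy request log: (user, requested content).
    history = [("u1", "a"), ("u1", "b"), ("u2", "a"), ("u3", "c"), ("u3", "a")]
    prefs = predict_user_preferences(history)
    popularity = compute_content_popularity(prefs)
    print(decide_cache_placement(popularity, capacity=2))  # e.g. ['a', 'b'] or ['a', 'c']
```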