Abstract

In the age of big data, Internet content is frequently accessed from the cloud by many users, which places a heavy load on the network servers of the service provider (SP). Numerous studies show that this problem can be effectively mitigated by caching content at the network edge. However, edge caching still faces several challenges. First, the cache-capacity dilemma: most existing edge devices have small storage capacity, which can be scaled up by deploying non-volatile memories (NVMs) in large numbers, but NVMs suffer from read-write asymmetry and limited write endurance. Second, diverse and changing user preferences for content also affect the effectiveness of edge caching. The explosive growth of Internet data has shifted user preference from a simple Zipf distribution to a stretched exponential distribution (SED) with a weakened long-tail effect. This shift affects content placement on edge cache devices and aggravates NVM wear. Third, in real Internet service scenarios, different users have different service-level requirements (i.e., differentiated services): the quality of service depends on their content requirements, and the SP needs to serve users differentially. To address these challenges, and aiming to improve SP revenue, we propose an auction mechanism that offers differentiated services. Compared with a baseline that optimizes SP revenue without considering NVM characteristics, differentiated services, or changing user preferences, extensive simulations verify that our proposal improves user experience and SP revenue by up to 103.90% and 99.33%, respectively, while keeping NVM wear comparable to the baseline even when the user preference distribution changes.
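As a rough illustration of the preference shift the abstract describes, the sketch below contrasts a Zipf popularity law with a stretched-exponential rank model and measures how much request mass falls outside the most popular items. The parameter values (`alpha`, `a`, `c`) and the specific SED form `p_i ∝ exp(-a * i^c)` are illustrative assumptions, not values or the exact model from the paper.

```python
import math

def zipf_popularity(n, alpha=0.8):
    # Zipf rank law: p_i proportional to i^(-alpha); heavy long tail.
    # alpha is an assumed illustrative exponent, not from the paper.
    w = [i ** -alpha for i in range(1, n + 1)]
    s = sum(w)
    return [x / s for x in w]

def stretched_exp_popularity(n, a=0.1, c=0.5):
    # One common stretched-exponential rank model:
    # p_i proportional to exp(-a * i^c); the tail decays faster than Zipf.
    # a and c are assumed illustrative parameters.
    w = [math.exp(-a * i ** c) for i in range(1, n + 1)]
    s = sum(w)
    return [x / s for x in w]

def tail_mass(p, head_frac=0.1):
    # Fraction of all requests that go to items OUTSIDE the top head_frac
    # of ranks -- a simple proxy for the strength of the long-tail effect.
    k = int(len(p) * head_frac)
    return 1.0 - sum(p[:k])

n = 10000
zipf = zipf_popularity(n)
sed = stretched_exp_popularity(n)
print(f"Zipf tail mass (outside top 10%): {tail_mass(zipf):.3f}")
print(f"SED  tail mass (outside top 10%): {tail_mass(sed):.3f}")
```

With these parameters the SED leaves noticeably less request mass in the tail than Zipf, which is the "weakened long-tail effect" the abstract refers to: popularity concentrates on fewer items, so content placement (and hence NVM write traffic) at the edge changes when the distribution shifts.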
