Abstract

The hybrid LRU algorithm was built to run a parameterized priority queue on top of the Least Recently Used model, helping to determine the optimal object to remove from the cache. Experimental results demonstrated an approximately 30% decrease in the execution time needed to extract data from the cache store during object retrieval. In the era of modern utility computing, Serverless architecture is a cloud platform concept that hides server management from developers and runs code on demand. This paper presents the Hybrid LRU algorithm implemented on a Serverless architecture to leverage its benefits. The new technique adds advantages such as infrastructure instance scalability, no server management, and reduced cost through efficient usage. The paper describes the experimental advantage of Hybrid LRU execution-time optimization using Serverless architecture.
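As a rough illustration of the idea summarized above, the sketch below shows a cache whose eviction decision blends classic LRU recency with a per-object priority parameter. The class name HybridLRUCache, the blending weight alpha, and the scoring formula are illustrative assumptions; the abstract does not specify the paper's exact parameterization.

    import time


    class HybridLRUCache:
        """Sketch of a cache whose eviction blends LRU recency with a
        per-object priority (hypothetical scoring; see note above)."""

        def __init__(self, capacity: int, alpha: float = 0.5):
            self.capacity = capacity
            self.alpha = alpha      # assumed weight between priority and recency
            self.store = {}         # key -> (value, priority)
            self.last_used = {}     # key -> timestamp of last access

        def get(self, key):
            if key not in self.store:
                return None
            self.last_used[key] = time.monotonic()   # refresh recency on access
            return self.store[key][0]

        def put(self, key, value, priority: float = 1.0):
            if key not in self.store and len(self.store) >= self.capacity:
                self._evict()
            self.store[key] = (value, priority)
            self.last_used[key] = time.monotonic()

        def _evict(self):
            now = time.monotonic()

            def score(k):
                # Lower score = better eviction candidate: old, low-priority
                # objects are removed first (illustrative formula).
                age = now - self.last_used[k]
                _, priority = self.store[k]
                return self.alpha * priority - (1 - self.alpha) * age

            victim = min(self.store, key=score)
            del self.store[victim]
            del self.last_used[victim]


    # Example usage: a small cache where "hot" objects carry higher priority.
    cache = HybridLRUCache(capacity=2, alpha=0.7)
    cache.put("a", "payload-a", priority=0.2)
    cache.put("b", "payload-b", priority=0.9)
    cache.put("c", "payload-c", priority=0.5)   # evicts the weakest entry

In a serverless deployment, such a handler-level cache would typically sit in front of an external store and be re-created per function instance; the scoring weight would then be passed in as a configuration parameter rather than hard-coded.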
