Abstract

This paper proposes prediction-based scaling and placement of service function chains (SFCs) to improve service level agreement (SLA) compliance and reduce operation cost. We used a variant of the recurrent neural network (RNN), the gated recurrent unit (GRU), for resource demand prediction. Then, considering these predictions, we built an intuitive scale-in/out algorithm. We also developed an algorithm that applies Q-Learning in an edge computing environment (EdgeQL) to place the scaled-out VNFs in appropriate locations. The integrated algorithm that combines prediction, scaling, and placement is called RNN-EdgeQL. RNN-EdgeQL (v2) further improves on this to achieve application-agnostic, group-level elasticity in the chain, independent of the applications installed on the VNFs. We tested our algorithms on two realistic temporally dynamic load models, an Internet traffic trace (Abilene) and an application-specific traffic trace (Wiki), on an OpenStack testbed. The contribution of this article is threefold. First, the prediction model prepares the target SFC for the upcoming load. Second, the application-agnostic design of the algorithm achieves group-level elasticity in the SFC. Finally, the EdgeQL placement model minimizes the end-to-end path of an SFC in a multi-access edge computing (MEC) environment. As a result, RNN-EdgeQL (v2) gives the lowest overall latency, the fewest SLA violations, and the lowest VNF requirement compared with RNN-EdgeQL (v1) and threshold-based scaling with OpenStack default placement.
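To make the pipeline concrete, the following sketch illustrates how a GRU-based forecaster could feed a simple scale-in/out rule. This is a minimal illustration in PyTorch, not the paper's implementation; the window length, per-VNF capacity, and decision logic are assumptions made for the example.

import math

import torch
import torch.nn as nn

WINDOW = 12          # number of past load samples fed to the predictor (assumed)
CPU_PER_VNF = 0.8    # normalized load one VNF instance can serve (assumed)

class GRUForecaster(nn.Module):
    """Predicts the next resource-demand sample from a window of past samples."""
    def __init__(self, hidden_size: int = 32):
        super().__init__()
        self.gru = nn.GRU(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, WINDOW, 1) -> predicted next demand: (batch, 1)
        out, _ = self.gru(x)
        return self.head(out[:, -1, :])

def scale_decision(predicted_demand: float, current_vnfs: int) -> int:
    """Return the target number of VNF instances for the predicted demand."""
    needed = max(1, math.ceil(predicted_demand / CPU_PER_VNF))
    if needed > current_vnfs:
        return needed        # scale out ahead of the predicted load
    if needed < current_vnfs:
        return needed        # scale in to release unused resources
    return current_vnfs      # no change required

Similarly, the placement step can be pictured as tabular Q-learning over candidate edge nodes, with a reward that favors short end-to-end SFC paths. The state/action encoding, reward shape, and hyperparameters below are illustrative assumptions, not the paper's EdgeQL formulation.

import random
from collections import defaultdict

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # assumed learning rate, discount, exploration rate

class EdgeQLPlacer:
    """Chooses an edge node for each scaled-out VNF using epsilon-greedy Q-learning."""
    def __init__(self, edge_nodes):
        self.edge_nodes = list(edge_nodes)   # candidate placement targets
        self.q = defaultdict(float)          # Q[(state, node)] -> expected value

    def choose_node(self, state):
        # Explore occasionally; otherwise exploit the best-known placement.
        if random.random() < EPSILON:
            return random.choice(self.edge_nodes)
        return max(self.edge_nodes, key=lambda n: self.q[(state, n)])

    def update(self, state, node, reward, next_state):
        # Standard Q-learning update; the reward is assumed to be higher when
        # the resulting end-to-end SFC path (and thus latency) is shorter.
        best_next = max(self.q[(next_state, n)] for n in self.edge_nodes)
        self.q[(state, node)] += ALPHA * (reward + GAMMA * best_next - self.q[(state, node)])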
