Abstract

Theoretical modeling is a popular method for the quantitative analysis and performance prediction of computer systems, including cloud systems. The low-entropy cloud (i.e., low interference among workloads and low system jitter) is becoming a new trend, and the server based on the Labeled Network Stack (LNS) is a representative case, gaining orders-of-magnitude performance improvement over servers based on traditional network stacks. However, two questions remain: 1) where do the low tail latency and the low entropy of LNS mainly come from, compared with mTCP, a typical user-space network stack in academia, and the Linux network stack, the mainstream network stack in industry; and 2) how much can LNS be further optimized? Therefore, we propose a queueing-theory-based analytical method that defines a bottleneck stage to simplify the quantitative analysis of tail latency. Facilitated by this method, we establish models characterizing the change of processing speed across stages for an LNS-based server, an mTCP-based server, and a Linux-based server, taking bursty traffic as an example. Under such traffic, the processing speed of each network service stage is obtained by non-intrusive basic tests, and the slowest stage is identified as the bottleneck according to traffic and system characteristics. Our models reveal that full-datapath prioritized processing and full-path zero-copy are the primary sources of the low tail latency and the low entropy of the LNS-based server, with 0.8%-24.4% error for the 99th percentile latency. In addition, the model of the LNS-based server can give the best number of worker threads for querying a database, improving concurrency by 2.1×-3.5×.
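To make the bottleneck-stage idea concrete, the following is a minimal sketch, not the paper's actual model: it assumes each stage can be approximated as an M/M/1 queue, so the sojourn time of a stable stage is exponential with rate (mu - lambda) and the p-th percentile is -ln(1 - p) / (mu - lambda). The per-stage service rates and the workload figures below are hypothetical stand-ins for the rates the paper obtains from non-intrusive basic tests.

```python
import math

def bottleneck_stage(service_rates):
    """Index of the slowest stage (lowest service rate), i.e., the bottleneck."""
    return min(range(len(service_rates)), key=lambda i: service_rates[i])

def percentile_latency_mm1(arrival_rate, service_rate, p=0.99):
    """M/M/1 sojourn time is Exp(mu - lambda), so t_p = -ln(1-p)/(mu - lambda).
    Requires mu > lambda (a stable queue)."""
    assert service_rate > arrival_rate, "queue must be stable (mu > lambda)"
    return -math.log(1.0 - p) / (service_rate - arrival_rate)

# Hypothetical per-stage service rates (requests/ms), e.g., NIC RX,
# protocol processing, application logic, database query.
rates = [80.0, 55.0, 30.0, 45.0]
lam = 25.0  # hypothetical offered load, requests/ms

b = bottleneck_stage(rates)
print(f"bottleneck stage: {b}, "
      f"p99 latency ~= {percentile_latency_mm1(lam, rates[b]):.3f} ms")
```

Under this approximation the tail latency of the whole pipeline is dominated by the stage with the smallest (mu - lambda) margin, which is why identifying the slowest stage suffices for a first-order tail-latency estimate; the paper's models additionally account for bursty traffic and stage-dependent processing speeds.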
