Abstract

Cloud service providers integrate QoS monitoring capabilities into their cloud platforms, both to monitor platform performance and to confirm Service Level Agreement compliance to clients. Unfortunately, this arrangement serves the interests of the cloud provider more than those of the cloud client, since providers gauge their services using their own tools. This paper presents a comparative study of the capabilities of a client-based, vendor-neutral QoS tool, developed from a vendor-neutral QoS monitoring model, against the cloud providers' integrated QoS monitoring tools. The comparison covered four global SaaS cloud service providers, namely Salesforce, Google, HubSpot and Shopify. The study found that the client-based vendor-neutral tool has more capabilities than the provider-integrated tools, as it can monitor three key QoS metrics, namely service response time, service availability and service stability, whereas the cloud providers' tools offer only one quantitative capability. Furthermore, the vendor-neutral model can be used across any cloud platform that is accessible via a web browser, enabling cross-platform performance comparison among cloud providers. This can aid decision making regarding which cloud service provider to procure based on the desired performance.

Highlights

  • Cloud computing involves the delivery of hardware and software resources and services to users over the Internet [1]

  • The Quality of Platform (QoP) comprises transparency, location-aware capability, SLA management, portability and data auditing; the Quality of the Application (QoA) comprises multitenancy, configuration, interoperability and software fault tolerance; while the Quality of Experience (QoE) focuses on service availability, usability, performance and response timeliness

  • This experiment focused on Quality of Experience (QoE) metrics, defined as follows: service response time is the average time it took for the user-specified service to be initialized and ready for use; availability was measured as the number of instances in which the user requested a service and received it against the number of instances in which the requested service was not available; while service stability was computed as the standard deviation of the measured response times
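The three QoE metrics above can be sketched in a short computation. This is a minimal illustration, not the paper's implementation: the function name and the sample measurements below are hypothetical, and availability is expressed here as the fraction of successful service requests, consistent with the ratio described above.

```python
import statistics

def qoe_metrics(response_times_ms, successful_requests, failed_requests):
    """Illustrative computation of the three QoE metrics.

    response_times_ms: measured service response times in milliseconds
    successful_requests: instances where the requested service was delivered
    failed_requests: instances where the requested service was not available
    """
    # Service response time: average of the measured times
    avg_response = sum(response_times_ms) / len(response_times_ms)
    # Availability: successful requests as a fraction of all requests
    availability = successful_requests / (successful_requests + failed_requests)
    # Service stability: standard deviation of the response times
    stability = statistics.stdev(response_times_ms)
    return avg_response, availability, stability

# Hypothetical sample: four timed requests, 98 successes, 2 failures
avg, avail, stab = qoe_metrics([120, 135, 128, 142], 98, 2)
```

A lower standard deviation indicates more stable service delivery, since the response times cluster more tightly around their mean.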


Summary

INTRODUCTION

Cloud computing involves the delivery of hardware and software resources and services to users over the Internet [1]. According to [4], there are reservations regarding security, privacy and trust that deter the adoption of cloud computing in spite of its several beneficial features. This further hints at the need for a neutral cloud QoS monitoring framework, especially one that is client-centric. According to [7], firms worry whether cloud computing solutions offer sufficient availability, and [7] therefore proposes the use of multiple cloud providers for redundancy. This introduces the need for a cross-platform tool that can monitor and measure across different cloud platforms for comparison purposes. The auto-provisioning feature based on dynamic user needs, praised as a strength of cloud computing, poses a performance bottleneck: demand sometimes increases very rapidly while resources are unavailable, resulting in delayed services or non-availability of services during peak load [14]. To gauge and validate the credibility of the vendor-neutral model, this paper performs a comparative study of the vendor-neutral tool against the cloud service providers' tools.

RELATED WORK
RESEARCH DESIGN
Experimentation with Existing Cloud Provider’s Platforms
Gsuite
Hubspot
SalesForce
Shopify
CONCLUSION
COMPARISON BETWEEN THE VENDOR NEUTRAL MODEL AND CLOUD VENDOR SPECIFIC TOOLS