Abstract
End-to-end testing is critical for ensuring the quality of a software product. Popular software products are deployed by customers in a wide variety of configurations over a broad range of hardware. Testing efforts, being subject to normal business constraints of limited time and resources, cannot recreate all customer scenarios in-house. In such a situation, it is valuable to model a wide range of customer scenarios so that their shared elements can be distinguished from their unique traits. Given such a modeling approach, a testing infrastructure is needed that can reuse traditional functional tests but reorient them to provide high customer scenario coverage at optimal cost.

In this paper, we model customer scenarios from the perspective of the services they enable. Services typically have multiple categories of users, with each category expecting the service to provide functionality catering to its unique requirements. We model this setup using the 'Persona - Experience - Usecase' hierarchy. A persona is a category of users that utilizes a service. An experience is what each user category expects from the service. Usecases clarify the specific requirements of the distinct functionality expected by each user category. Modeling customer scenarios with this hierarchy helps demarcate requirements common to multiple scenarios from those specific to certain deployments.

We create a scenario test model to test the corresponding experiences and usecases for different personas. The test model consists of topologies and schemas. A test topology captures the uniqueness associated with each scenario and is split into two parts: roles, which capture the software settings associated with the customer scenario, and configurations, which capture the hardware setup, i.e. the machine-network-storage combination. A test schema captures the test activities related to the customer scenario; it consists of action groups, each associated with a usecase for the scenario. Thus the 'Persona - Experience - Usecase' hierarchy in customer scenario modeling broadly corresponds to the 'Test Scenario - Topology, Test Schema - Roles, Configurations, Action Groups' hierarchy in the test object model.

Multi-functional, service-oriented customer scenarios require that the core experiences associated with each persona continue to be met across the different states of the service. We use the concept of Service Level Agreements (SLAs) to capture such core experiences and to validate their continued satisfaction. Most real-world customer deployments are susceptible to hardware and software faults, so we explicitly add faults of both kinds to our object model to test service resiliency under such conditions. In a customer scenario, SLAs and faults are typically associated with usecases; we use a similar association in the test object model to compose action groups from three sources: user actions, faults, and user SLAs.
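To make the correspondence concrete, the following is a minimal sketch of such a test object model in Python. The class and field names (TestScenario, Topology, ActionGroup, and so on) are illustrative assumptions, not the paper's actual implementation; they only show how roles, configurations, and action groups composed of user actions, faults, and SLA checks could fit together.

```python
# Illustrative sketch of the test object model described above; names and
# structure are assumptions, not the authors' implementation.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

Action = Callable[[], bool]  # an executable test step; returns True on success

@dataclass
class Topology:
    roles: Dict[str, str]           # software settings, e.g. {"fileserver": "SMB"}
    configurations: Dict[str, str]  # hardware setup: machine-network-storage combination

@dataclass
class ActionGroup:
    usecase: str
    user_actions: List[Action] = field(default_factory=list)  # traditional functional tests
    faults: List[Action] = field(default_factory=list)        # simulated hardware/software faults
    sla_checks: List[Action] = field(default_factory=list)    # validate core experiences still hold

@dataclass
class TestSchema:
    action_groups: List[ActionGroup]

@dataclass
class TestScenario:
    persona: str        # user category the scenario exercises
    experience: str     # experience that persona expects from the service
    topology: Topology
    schema: TestSchema
```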
We implemented the proposed approach in the development cycle for Windows Server 2012 as a testing infrastructure with four major components: a deployment creator, a test scheduler, a monitoring system, and a reporting dashboard. A flexible deployment creator configured different test topologies (fileserver, high-availability clustering, and networking and storage variations). Action groups consisted of traditional functional tests, specially written SLA tests, and simulated hardware and software faults. A smart stochastic scheduler coordinated these action groups to simulate persona-experience patterns. A monitoring system gathered two distinct types of information: action group success status and system health indicators. Finally, a rich reporting dashboard overlaid this information on a continuous time graph for quality assurance and issue triage. This infrastructure enabled us to find issues that had been missed by traditional functional and stress tests. We targeted multiple customer scenarios and found a good mix of issues spanning different features in the software. These results increase our confidence in the approach, and we propose to evolve it further.
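The paper does not spell out the scheduler's policy, but the sketch below shows one plausible reading of a stochastic scheduler under the assumed object model above: action groups are drawn at random with per-usecase weights that approximate persona-experience patterns, executed, and their success status logged together with a system health sample for the dashboard's continuous time graph. The names run_scenario, weights, and probe_health are hypothetical.

```python
# Hypothetical stochastic scheduling loop for the object model sketched earlier.
import random
import time

def run_scenario(scenario, weights, probe_health, steps=100, seed=None):
    """Schedule action groups stochastically; return (timestamp, event, ok) tuples."""
    rng = random.Random(seed)
    groups = scenario.schema.action_groups
    log = []
    for _ in range(steps):
        # Weighted random draw biases time toward the usecases this persona exercises most.
        group = rng.choices(groups, weights=[weights.get(g.usecase, 1.0) for g in groups])[0]
        # Interleave user actions, injected faults, and SLA checks for the chosen usecase.
        for action in group.user_actions + group.faults + group.sla_checks:
            log.append((time.time(), group.usecase, action()))
        # System health indicators are sampled independently of action results,
        # so the dashboard can overlay both streams on one time axis.
        log.append((time.time(), "health", probe_health()))
    return log
```

Weighted random selection keeps long-running scenario tests non-deterministic (helping surface timing- and state-dependent issues) while still spending most of the time on the action groups a given persona would actually drive.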