Abstract

Web-based commercial systems are increasingly becoming as feature-rich, interactive, and functional as locally installed applications. Testing web applications poses unique challenges, as many factors affect system performance and user experience. Crowdsourcing is an appealing and economical solution to web application testing because it can reach a large international audience. However, little is known about how to control the quality of crowdsourced testing and harness the collective effort of individual testers. In our study, the collaborative testing problem in a crowdsourcing environment is defined as a job assignment problem and formulated as an integer linear programming (ILP) problem. The objective of this paper is to validate a greedy job assignment approach as a tool for the effective use of crowdsourced testing. We carried out a case study on Xturk, a prototype crowdsourced testing system, to understand crowdsourced testers' behaviour in terms of their trustworthiness, the execution time of test cases, and the accuracy of their feedback. Several experiments indicate that this approach is comparatively effective with regard to the feasibility verdict, efficiency, and accuracy.
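
To make the idea of greedy job assignment concrete, the following is a minimal sketch, assuming each crowd tester has an estimated trustworthiness score, an average execution time, and a capacity, and each test case is given to at most one tester. The names (Tester, Job, assign_greedy) and the scoring rule are illustrative assumptions, not the paper's actual formulation; the corresponding ILP would maximise a similar score subject to capacity and one-tester-per-job constraints.

# Hypothetical greedy assignment of test cases to crowd testers (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Tester:
    name: str
    trustworthiness: float   # estimated reliability in [0, 1]
    avg_exec_time: float     # average seconds per unit of test effort
    capacity: int            # maximum number of jobs this tester may take
    assigned: list = field(default_factory=list)

@dataclass
class Job:
    test_case: str
    effort: float            # relative size of the test case

def score(tester: Tester, job: Job) -> float:
    # Prefer trustworthy, fast testers; penalise longer expected execution time.
    return tester.trustworthiness / (1.0 + tester.avg_exec_time * job.effort)

def assign_greedy(jobs, testers):
    # Process larger jobs first, then give each one to the best tester with spare capacity.
    for job in sorted(jobs, key=lambda j: j.effort, reverse=True):
        candidates = [t for t in testers if len(t.assigned) < t.capacity]
        if not candidates:
            break  # no remaining capacity; the job stays unassigned
        best = max(candidates, key=lambda t: score(t, job))
        best.assigned.append(job.test_case)
    return {t.name: t.assigned for t in testers}

if __name__ == "__main__":
    testers = [Tester("alice", 0.9, 30.0, 2), Tester("bob", 0.7, 12.0, 3)]
    jobs = [Job("login_form", 1.0), Job("checkout_flow", 2.5), Job("search_box", 0.5)]
    print(assign_greedy(jobs, testers))

Such a greedy heuristic produces a feasible assignment quickly, but unlike solving the ILP exactly it does not guarantee an optimal assignment; validating this trade-off is what the paper's experiments address.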
