Abstract

Crowdsourced testing is an emerging trend in mobile application testing. Its openness offers a promising way to conduct large-scale, user-oriented testing on a wide variety of mobile devices, but it also introduces a problem: crowdworkers with widely varying levels of testing experience severely threaten the quality of crowdsourced testing. Many approaches have been proposed to improve crowdsourced testing, yet they do not fundamentally improve the ability of the crowdworkers themselves. In essence, low-quality crowdsourced testing arises because crowdworkers are unfamiliar with the App Under Test (AUT) and do not know which parts of the AUT should be tested. To address this problem, we propose a testing assistance approach that leverages Android automated testing techniques (i.e., dynamic and static analysis) to improve crowdsourced testing. Our approach constructs an Annotated Window Transition Graph (AWTG) model of the AUT by merging dynamic and static analysis results. Based on the AWTG model, it implements a testing assistance pipeline that provides test task extraction, test task recommendation, and test task guidance to assist crowdworkers in testing the AUT. We evaluate our approach experimentally on real-world AUTs. The quantitative results demonstrate that our approach assists crowdsourced testing effectively and efficiently, and the qualitative results of a user study confirm its usefulness.
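For illustration only, the sketch below shows one plausible way such an AWTG could be represented and built by merging static and dynamic analysis results: windows (assumed here to correspond to Android activities) form the nodes, and transitions are annotated with the triggering UI event and the analysis that discovered them. All class, window, and event names are hypothetical and are not taken from the paper's implementation.

```python
# Minimal sketch of an Annotated Window Transition Graph (AWTG), assuming
# windows map to Android activities and each edge records its triggering
# event and the analysis (static or dynamic) that produced it.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Transition:
    source: str   # window (activity) where the event is triggered
    target: str   # window reached after the event
    event: str    # e.g. "click(btn_login)"
    origin: str   # "static" or "dynamic"


@dataclass
class AWTG:
    windows: set = field(default_factory=set)
    transitions: set = field(default_factory=set)

    def add(self, t: Transition) -> None:
        # Adding a transition also registers both of its endpoint windows.
        self.windows.update({t.source, t.target})
        self.transitions.add(t)

    @staticmethod
    def merge(static_results, dynamic_results) -> "AWTG":
        """Merge static and dynamic transitions into one annotated graph."""
        graph = AWTG()
        for t in list(static_results) + list(dynamic_results):
            graph.add(t)
        return graph


# Hypothetical usage: a statically inferred transition is combined with
# a dynamically observed one into a single graph.
static = [Transition("LoginActivity", "HomeActivity", "click(btn_login)", "static")]
dynamic = [Transition("HomeActivity", "SettingsActivity", "click(menu_settings)", "dynamic")]
awtg = AWTG.merge(static, dynamic)
print(sorted(awtg.windows))
```

A graph built this way could then serve as the backbone for task extraction (paths over the graph), recommendation (ranking uncovered windows), and guidance (step-by-step event sequences), as described in the abstract.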
