Abstract

Crowdsourcing is an appealing concept for achieving good-enough requirements and just-in-time requirements engineering (RE). A promising form of crowdsourcing in RE is the use of feedback on software systems, generated by a large network of anonymous users of these systems over a period of time. Prior research has indicated that implicit and explicit user feedback are key for RE practitioners to discover new and changed requirements and to decide on software features to add, enhance, or abandon. However, a structured account of the types and characteristics of user feedback useful for RE purposes is still lacking. This research fills the gap by providing a mapping study of the literature on crowdsourced user feedback employed for RE purposes. On the basis of the analysis of 44 selected papers, we found nine pieces of metadata that characterize crowdsourced user feedback and that were employed in seven specific RE activities. We also found that published research has a strong focus on using crowd-generated comments (explicit feedback) for RE purposes, rather than employing application logs or usage-generated data (implicit feedback). Our findings suggest a need to broaden the scope of research effort in order to leverage the benefits of both explicit and implicit feedback in RE.

Highlights

  • Crowd‐based requirements engineering (RE) is the practice of large‐scale user involvement in RE activities

  • The study presents the distribution of explicit and implicit user feedback employed in RE, according to the origins of these two types of crowdsourced feedback

  • On the basis of 44 selected publications, this mapping study provides an overview of the types of user feedback that have been employed for crowdsourced RE activities


Introduction

Crowd-based requirements engineering (RE) is the practice of large-scale user involvement in RE activities. The users are unknown volunteers, massive in number, and their involvement can take a variety of forms. To IT-consulting companies and software development organizations, this opportunity for large-scale user involvement offers a way to obtain good-enough requirements and to achieve just-in-time RE, which implies a significant potential to reduce the cost of RE processes. One promising form of crowdsourcing in RE, exploited by businesses and attracting much research attention, is the use of feedback volunteered by large networks of anonymous users of software systems over a period of time in an RE activity such as elicitation, validation, or prioritization.[1] So far, prior research suggests that implicit and explicit user feedback are key inputs for RE practitioners to discover new and changed requirements and to decide which software features to add, enhance, or abandon.
