Abstract

Software crowdsourcing, the act of outsourcing software development tasks to a crowd in the form of an open call, is mediated by a platform and organized around tasks. In the competitive model, members of the crowd search for tasks and submit solutions in an attempt to receive financial rewards. In this context, the task description plays a relevant role, since understanding it supports both the selection and the development of a task. However, little is known about how task descriptions support these processes. To help fill this gap, this paper presents an empirical study exploring the role of documentation when developers select and develop tasks in software crowdsourcing. The TopCoder platform was studied in two stages: a case study with newcomers to crowdsourcing (in the classroom) and an interview-based study with industry professionals. We found that documentation quality influences task selection: tasks with an unclear description of objectives, or that fail to specify the required technologies or environment setup instructions, discourage developers from selecting them. We also found that poorly specified or incomplete tasks lead developers to look for supplementary material or to invest more time and effort than initially estimated. The results provide a better understanding of the importance of task documentation in software crowdsourcing and point out what information is important to the crowd.
