Title: Crowdsourcing in Software Development: Empirical Support for Configuring Contests
Authors: Bibi, Stamatia
Zozas, Ioannis
Ampatzoglou, Apostolos
Sarigiannidis, Panagiotis
Kalampokis, George
Stamelos, Ioannis
Type: Article
Subjects: FRASCATI::Natural sciences::Computer and information sciences
Keywords: crowdsourcing
software development
success factors
crowd factors
Issue Date: 2020
Source: IEEE Access
First Page: 1
Last Page: 25
Abstract: Despite the extensive adoption of crowdsourcing for the timely, cost-effective, and high-quality completion of software development tasks, a large number of crowdsourced challenges fail to acquire a winning solution on time and within the desired cost and quality thresholds. A possible reason for this is that we currently lack a systematic approach that would aid software managers during the process of designing software development tasks that will be crowdsourced. This paper attempts to extend the current knowledge on designing crowdsourced software development tasks, by empirically answering the following management questions: (a) what type of projects should be crowdsourced; (b) why should one crowdsource—in terms of acquired benefits; (c) where should one crowdsource—in terms of application domain; (d) when to crowdsource—referring to the time period of the year; (e) who will win or participate in the contest; and (f) how to crowdsource (define contest duration, prize, type of contest, etc.) to acquire the maximum benefits—depending on the goal of crowdsourcing. To answer the aforementioned questions, we have performed a case study on 2,209 software development tasks crowdsourced through the TopCoder platform. The results suggest that there are significant differences in the level to which crowdsourcing goals are reached across different software development activities. Based on this observation, we suggest that software managers should prioritize the goals of crowdsourcing, decide carefully upon the activity to be crowdsourced, and then define the settings of the task.
ISSN: 2169-3536
Other Identifiers: 10.1109/ACCESS.2020.2982619
Appears in Collections:Department of Applied Informatics

Files in This Item:
File: IEEEAcess_Bibi_etal.pdf (2.97 MB, Adobe PDF)
