Abstract

As Empirical Software Engineering matures and its body of publications grows, more replications are needed to provide solid grounding for the evidence produced by prior research. However, replication studies are scarce in general, and some topics suffer from this scarcity more than others. Moreover, the challenges associated with replicating empirical studies are not well understood. In this study, we aim to fill this gap by investigating the difficulties that emerge when replicating an experiment. We recruited participants with distinct backgrounds to play the role of a research group attempting to replicate an experimental study addressing Highly-Configurable Systems. In total, seven external close replications were performed. After obtaining the quantitative replication results, we held a focus group session with each group, inquiring about the replication experience. We used grounded theory's constant comparison method for the qualitative analysis. We observed that, in the replications performed, most results hold when compared with the baseline. However, the participants reported many difficulties in replicating the original study, mostly related to unclear instructions and defects in the replication artifacts. Based on our findings, we provide recommendations to help mitigate the reported problems. The quality of replication artifacts and the clarity of instructions can affect an experiment replication. We advocate providing good-quality replication instructions and well-prepared laboratory packages to foster and enable researchers to perform better replications.
