Abstract

Software defect prediction, which locates defective code snippets, can help developers find potential bugs and allocate their testing effort. Traditional defect prediction features are static code metrics, which capture only statistical properties of programs and miss their semantics, degrading prediction performance. To take full advantage of both program semantics and static metrics, we propose a framework called Defect Prediction via Attention Mechanism (DP-AM). Specifically, DP-AM first extracts token sequences from the abstract syntax trees (ASTs) of programs and encodes them as numerical vectors through mapping and word embedding. It then feeds these vectors into a recurrent neural network to automatically learn semantic features of the programs. Next, it applies a self-attention mechanism to model the relationships among these features, followed by a global attention mechanism to distill the most significant ones. Finally, it combines the resulting semantic features with traditional static metrics for accurate software defect prediction. We evaluate our method in terms of F1-measure on seven open-source Java projects from Apache. Experimental results show that DP-AM improves F1-measure by 11% on average compared with state-of-the-art methods.
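To make the described pipeline concrete, the sketch below shows one plausible way to wire these components together in PyTorch. It is an illustration based on the abstract's description, not the authors' implementation: the choice of an LSTM as the recurrent network, the learned-query pooling used for the global attention step, the layer sizes, and the class name DPAMSketch are all assumptions.

```python
# Illustrative sketch (assumptions noted above), not the authors' code:
# embedding + RNN encoder over AST token sequences, self-attention over the
# learned features, global-attention pooling, then concatenation with
# traditional static code metrics before a defect classifier.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DPAMSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=64, num_metrics=20):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Recurrent encoder over the AST token sequence (hidden size is assumed).
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Self-attention to relate the learned semantic features to each other.
        self.self_attn = nn.MultiheadAttention(hidden_dim, num_heads=4, batch_first=True)
        # Global attention: a learned query that pools the sequence into one vector.
        self.global_query = nn.Parameter(torch.randn(hidden_dim))
        # Classifier over [semantic features ; static code metrics].
        self.classifier = nn.Linear(hidden_dim + num_metrics, 1)

    def forward(self, token_ids, static_metrics):
        x = self.embed(token_ids)                         # (B, T, E)
        h, _ = self.rnn(x)                                # (B, T, H)
        h, _ = self.self_attn(h, h, h)                    # relate features to each other
        # Global attention pooling: score each timestep against the learned query.
        scores = h @ self.global_query                    # (B, T)
        weights = F.softmax(scores, dim=1).unsqueeze(-1)  # (B, T, 1)
        semantic = (weights * h).sum(dim=1)               # (B, H)
        # Combine semantic features with traditional static metrics.
        combined = torch.cat([semantic, static_metrics], dim=1)
        return torch.sigmoid(self.classifier(combined))   # defect probability
```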
