Abstract

Bug question answering is an effective way to acquire required bug information and to aid bug comprehension. Many existing approaches use keyword-matching techniques to retrieve bug information directly, without understanding the semantics of the bug data, which often makes the returned results irrelevant to the input queries. To alleviate this problem, we present a novel bug question answering approach named BERT-BugQA that takes advantage of Bidirectional Encoder Representations from Transformers (BERT), which can fully consider the bidirectional context of bug information. In particular, we design a common paradigm to construct the bug reading comprehension dataset for this approach. An empirical study demonstrates that BERT-BugQA effectively obtains answers automatically, with F1-scores on the Mozilla and Eclipse projects of 0.84 and 0.83, respectively, outperforming state-of-the-art Q&A approaches.

Index Terms: Bug question answering, BERT, bug natural language reading comprehension.
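To make the reading-comprehension setting concrete: a BERT-style extractive QA model scores every token of a bug report as a potential answer start and answer end, and the answer is the highest-scoring valid span. The sketch below illustrates only this span-selection step; it is not the authors' code, and the logits are toy values standing in for a model's per-token predictions.

```python
# Minimal sketch of extractive span selection as used in BERT-style
# reading comprehension. The start/end logits here are hand-set toy
# values; in BERT-BugQA they would come from the model.

def extract_answer(tokens, start_logits, end_logits, max_len=10):
    """Pick the (start, end) pair with the highest combined score,
    subject to start <= end and a maximum answer length."""
    best = (0, 0, float("-inf"))
    for i, s in enumerate(start_logits):
        for j in range(i, min(i + max_len, len(tokens))):
            score = s + end_logits[j]
            if score > best[2]:
                best = (i, j, score)
    i, j, _ = best
    return " ".join(tokens[i:j + 1])

# Toy bug-report sentence; logits peak on the span "null pointer".
tokens = ["the", "crash", "is", "caused", "by", "a", "null", "pointer"]
start = [0.1, 0.2, 0.0, 0.1, 0.0, 0.3, 2.5, 0.4]
end   = [0.0, 0.1, 0.1, 0.0, 0.2, 0.1, 0.3, 2.8]
print(extract_answer(tokens, start, end))  # -> null pointer
```

Real implementations add refinements (e.g. restricting spans to the passage rather than the question, and handling sub-word tokens), but the core argmax-over-spans logic is the same.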
