Abstract

Bug question answering is an effective way to acquire required bug information and to support bug comprehension. Many existing approaches rely on keyword matching to retrieve bug information directly, without understanding the semantics of the bug data, which often makes the returned results irrelevant to the input queries. To alleviate this problem, we present a novel bug question answering approach named BERT-BugQA, which takes advantage of Bidirectional Encoder Representations from Transformers (BERT) to fully consider the bidirectional context of bug information. In particular, we design a common paradigm to construct a bug reading comprehension dataset for this approach. An empirical study demonstrates that BERT-BugQA effectively obtains answers automatically, achieving F1-scores of 0.84 and 0.83 on the Mozilla and Eclipse projects, respectively, outperforming state-of-the-art Q&A approaches.

Index Terms: Bug question answering, BERT, Bug natural language reading comprehension.
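The abstract reports F1-scores of 0.84 and 0.83 for extracted answers. The paper's exact evaluation code is not given here; the sketch below shows the standard SQuAD-style token-overlap F1 commonly used to score extractive reading-comprehension answers, which is the usual metric for this task (an assumption, not confirmed by the abstract).

```python
from collections import Counter


def token_f1(prediction: str, gold: str) -> float:
    """SQuAD-style token-overlap F1 between a predicted and a gold answer span.

    Precision = shared tokens / predicted tokens;
    Recall    = shared tokens / gold tokens;
    F1        = harmonic mean of the two.
    """
    pred_tokens = prediction.lower().split()
    gold_tokens = gold.lower().split()
    # Multiset intersection counts each shared token at most as often
    # as it appears in both answers.
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)


# Hypothetical bug-answer example (not from the paper's dataset):
score = token_f1("null pointer exception",
                 "a null pointer exception in the parser")
# precision = 3/3, recall = 3/7, so F1 = 0.6
```

In per-question evaluation, scores like this are averaged over the test set, which is how a dataset-level figure such as 0.84 would typically be obtained.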
