Abstract

Requirements classification is a crucial task in requirements engineering. The analysis of functional and non-functional requirements (NFRs) requires domain knowledge. NFRs are quality attributes that hold critical information about the constraints on which the success of a software system depends. Requirements are usually expressed in natural language, which makes their extraction and classification challenging. Existing studies have exploited various automatic techniques for requirements classification. Recently, transfer learning has attracted the attention of researchers as a deep learning approach that excels in a range of Natural Language Processing tasks, and the transfer-learning-based pre-trained BERT model has achieved more promising results than earlier state-of-the-art approaches. This research presents a Bidirectional Encoder Representations from Transformers-Convolutional Neural Network (BERT-CNN) model for requirements classification, in which a convolutional layer is stacked on top of the BERT encoder to enhance performance. The model was evaluated on the PROMISE dataset of 625 requirements. The experimental results demonstrate that the proposed model outperforms the state-of-the-art baseline approach.
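The abstract only states that a convolutional layer is stacked over BERT; it does not report layer sizes or hyperparameters. The sketch below is a minimal illustration of that general architecture in PyTorch, assuming the `bert-base-uncased` checkpoint, illustrative kernel sizes and filter counts, and a placeholder number of requirement classes; none of these choices are taken from the paper.

```python
# Illustrative BERT-CNN requirements classifier (assumed hyperparameters,
# not the authors' reported configuration).
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BertCnnClassifier(nn.Module):
    def __init__(self, num_classes=12, num_filters=128, kernel_sizes=(2, 3, 4)):
        super().__init__()
        self.bert = AutoModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        # One 1-D convolution per kernel size, applied over the token dimension.
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes
        )
        self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, input_ids, attention_mask):
        # Token-level embeddings from BERT: (batch, seq_len, hidden).
        tokens = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        x = tokens.transpose(1, 2)  # (batch, hidden, seq_len) for Conv1d
        # Convolve, apply ReLU, then max-pool over the sequence dimension.
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.classifier(torch.cat(pooled, dim=1))

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertCnnClassifier()
batch = tokenizer(
    ["The system shall respond to user queries within 2 seconds."],
    return_tensors="pt", padding=True, truncation=True,
)
logits = model(batch["input_ids"], batch["attention_mask"])  # (1, num_classes)
```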
