Abstract

Named entity recognition (NER) plays an important role in information extraction, but most models rely on large-scale labeled data, and freeing models from this dependence is challenging. This paper proposes SCNER, a self-supervised NER model. A bidirectional LSTM (BiLSTM) is adopted as the named entity extractor, and an Instruction Generation Subsystem (IGS) is proposed to generate "Retelling Instructions"; the similarity between the input instruction and its retelling instruction serves as the loss for model training. A set of rules, derived from traditional learning rules, is proposed for the discrete forward computation and error backpropagation. Mimicking language learning in human infants, the SCNER model is applied to robot instruction understanding and can be trained on unlabeled data to extract named entities from instructions. Experimental results show that the proposed model is competitive with the supervised BiLSTM-CRF and BERT-NER models. In addition, deploying the model on a real robot verifies the practicality of SCNER.
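
To make the training signal concrete, the following is a minimal sketch, assuming a PyTorch setup; it is not the authors' implementation. A BiLSTM tags each token, the predicted probability that a token is an entity weights a pooled sentence representation, and cosine similarity to an IGS-produced retelling instruction supplies the self-supervised loss. All module names, sizes, and the soft (differentiable) entity weighting are illustrative assumptions; the paper instead treats the discrete pipeline with its own forward and backpropagation rules, and the real IGS is a separate subsystem rather than the random stand-in used here.

```python
# Minimal sketch (illustrative, NOT the authors' implementation) of the
# SCNER training idea: similarity between an input instruction and a
# "Retelling Instruction" acts as a self-supervised loss for a BiLSTM
# entity extractor. Vocabulary size, dimensions, tag set, and the soft
# entity weighting below are all assumptions made for this sketch.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, EMB, HID, TAGS = 1000, 64, 128, 5  # assumed sizes; tag 0 = "O"

class BiLSTMTagger(nn.Module):
    """BiLSTM named entity extractor: one tag score vector per token."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.lstm = nn.LSTM(EMB, HID, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * HID, TAGS)

    def forward(self, tokens):
        hidden, _ = self.lstm(self.emb(tokens))
        return self.out(hidden)  # (batch, seq_len, TAGS)

tagger = BiLSTMTagger()
optimizer = torch.optim.Adam(tagger.parameters(), lr=1e-3)

# Tokenized input instruction and a stand-in for the IGS output; a real
# IGS would generate the retelling from the extracted entities.
instruction = torch.randint(0, VOCAB, (1, 8))
retelling = torch.randint(0, VOCAB, (1, 8))

tag_probs = tagger(instruction).softmax(dim=-1)  # (1, seq_len, TAGS)
entity_weight = 1.0 - tag_probs[..., 0]          # P(token is an entity)

# Soft, differentiable stand-in for discrete entity extraction: weight
# token embeddings by their entity probability, then mean-pool. The
# paper handles the discrete case with custom learning rules instead.
extracted = (entity_weight.unsqueeze(-1) * tagger.emb(instruction)).mean(dim=1)
retold = tagger.emb(retelling).mean(dim=1)       # pooled retelling embedding

# Higher similarity between input and retelling -> lower loss.
loss = 1.0 - F.cosine_similarity(extracted, retold).mean()
loss.backward()
optimizer.step()
```

The soft weighting is only a device to keep this sketch end-to-end differentiable; per the abstract, the paper's contribution is precisely a rule-based treatment of the discrete forward computation and error backpropagation.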
