Abstract

Membrane computing is a class of parallel computing models (generally called P systems) abstracted from the information-exchange mechanisms of biological cells, tissues, or neurons, which can process data in a distributed and interpretable manner. LSTM-SNP, the first long short-term memory network based on nonlinear spiking neural P systems with parameters, was proposed recently. However, a systematic understanding and use of the LSTM-SNP model for named entity recognition (NER) and other natural language processing (NLP) tasks are still lacking. The bottleneck of the NER task lies in the scarcity of data and the vague definition of entity boundaries. Most existing approaches focus on dataset handling, and few attempts have been made to address the problem with spiking neural P (SNP) systems. This paper proposes a model named CLSTM-SNP, built on LSTM-SNP, to tackle the NER problem within the framework of SNP systems for the first time. First, a CNN layer extracts character-level features. Second, GloVe word vectors serve as word representations. Third, the LSTM-SNP layer analyzes the resulting textual features. We then evaluated the effectiveness of CLSTM-SNP on the CoNLL-2003 and OntoNotes 5.0 datasets and compared it with five baseline methods. CLSTM-SNP achieved macro F1-scores of 89.2% on CoNLL-2003 and 75.5% on OntoNotes 5.0. The performance of CLSTM-SNP and LSTM-SNP indicates great potential for handling named entity recognition and other sequential NLP tasks.
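The abstract outlines a three-part pipeline (character-level CNN features, GloVe word embeddings, an LSTM-SNP sequence encoder). The sketch below illustrates how such a pipeline could be wired together; it is not the authors' implementation, and the class names, layer sizes, and the use of a standard bidirectional LSTM as a stand-in for the paper's LSTM-SNP recurrence are all assumptions for illustration only.

```python
# Minimal sketch of a CLSTM-SNP-style NER tagger (assumed structure, not the
# paper's code): CharCNN features + pretrained GloVe word vectors feed a
# recurrent encoder; nn.LSTM is a placeholder for the LSTM-SNP layer.
import torch
import torch.nn as nn


class CharCNN(nn.Module):
    """Character-level encoder: embed characters, convolve, max-pool per word."""
    def __init__(self, n_chars, char_dim=30, n_filters=30, kernel=3):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
        self.conv = nn.Conv1d(char_dim, n_filters, kernel, padding=kernel // 2)

    def forward(self, char_ids):                    # (batch, seq, max_word_len)
        b, s, w = char_ids.shape
        x = self.char_emb(char_ids.view(b * s, w))  # (b*s, w, char_dim)
        x = self.conv(x.transpose(1, 2))            # (b*s, n_filters, w)
        x = torch.max(x, dim=2).values              # max-pool over characters
        return x.view(b, s, -1)                     # (batch, seq, n_filters)


class CLSTMSNPTagger(nn.Module):
    """Concatenate GloVe word vectors with CharCNN features, then tag tokens."""
    def __init__(self, glove_weights, n_chars, n_tags, hidden=200):
        super().__init__()
        self.word_emb = nn.Embedding.from_pretrained(glove_weights, freeze=False)
        self.char_cnn = CharCNN(n_chars)
        in_dim = glove_weights.size(1) + 30
        # Placeholder for the LSTM-SNP recurrence described in the paper.
        self.encoder = nn.LSTM(in_dim, hidden, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, n_tags)

    def forward(self, word_ids, char_ids):
        feats = torch.cat([self.word_emb(word_ids), self.char_cnn(char_ids)], dim=-1)
        out, _ = self.encoder(feats)
        return self.classifier(out)                 # per-token tag logits


# Toy usage with random "GloVe" weights (a real run would load glove.6B vectors).
glove = torch.randn(5000, 100)
model = CLSTMSNPTagger(glove, n_chars=80, n_tags=9)  # 9 BIO tags for CoNLL-2003
logits = model(torch.randint(0, 5000, (2, 12)), torch.randint(0, 80, (2, 12, 15)))
print(logits.shape)  # torch.Size([2, 12, 9])
```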
