Abstract

Over the past decade, intelligent legal systems have mainly relied on traditional natural language processing models such as Word2Vec and ELMo. Because these models are essentially trained in one direction, from left to right, they learn only unidirectional context and therefore suffer from limited efficiency and accuracy. To intelligently identify specific elements of a legal case, such as the time, location, perpetrator, and recipient, and to improve the efficiency of case processing, a new entity recognition method is proposed that uses the BERT (Bidirectional Encoder Representations from Transformers) model as the input layer. BERT is a word vector model that captures context by jointly conditioning the bidirectional Transformer across all layers. Building on BERT, we propose a method that combines BERT, BiLSTM, and CRF (Conditional Random Fields) to perform intelligent identification of legal case entities. Extensive experimental results show that the proposed method achieves better accuracy and efficiency than traditional models such as Word2Vec.
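The abstract describes a pipeline in which BERT supplies contextual token representations, a BiLSTM models sequence features, and a CRF decodes the final tag sequence. The sketch below is a minimal, hypothetical illustration of such an architecture, not the authors' released code; it assumes the HuggingFace `transformers` and `pytorch-crf` packages, and the checkpoint name `bert-base-chinese` is only a placeholder.

```python
# Minimal BERT-BiLSTM-CRF tagger sketch (assumptions: transformers, pytorch-crf).
import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF  # pip install pytorch-crf


class BertBiLstmCrf(nn.Module):
    def __init__(self, num_tags, bert_name="bert-base-chinese", lstm_hidden=256):
        super().__init__()
        # BERT input layer: contextual word vectors for each token.
        self.bert = BertModel.from_pretrained(bert_name)
        # BiLSTM: sequence features over the BERT embeddings.
        self.lstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                            batch_first=True, bidirectional=True)
        # Linear layer: per-token emission scores for each entity tag.
        self.fc = nn.Linear(2 * lstm_hidden, num_tags)
        # CRF: models tag-transition constraints and decodes the best sequence.
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        emb = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        feats, _ = self.lstm(emb)
        emissions = self.fc(feats)
        mask = attention_mask.bool()
        if tags is not None:
            # Training: return the negative log-likelihood of the gold tags.
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        # Inference: Viterbi-decode the most likely tag sequence per sentence.
        return self.crf.decode(emissions, mask=mask)
```

In this arrangement the entity labels (e.g. time, location, perpetrator, recipient) would be encoded as BIO-style tags; the CRF layer enforces valid tag transitions that a token-wise softmax would not.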
