Accurate and rapid translation facilitates cultural communication between different languages. This paper briefly introduces the long short-term memory (LSTM) algorithm. To enhance the performance of the LSTM algorithm, semantic features were introduced, and semantic similarity was used to screen for translations that better match the semantics of the source text. Simulation experiments were then conducted. The experiments first examined the effects of the number of hidden layer nodes and the type of activation function on the LSTM's translation performance; the LSTM algorithm was then compared with the recurrent neural network (RNN) and traditional LSTM algorithms. The proposed translation algorithm performed best with 512 hidden layer nodes and the sigmoid activation function. Under this configuration, it outperformed the other two translation algorithms, and its output was semantically consistent with the source text and fluent.
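The abstract does not specify how the semantic-similarity screening is computed. The following is a minimal sketch, assuming candidate translations are ranked by cosine similarity between sentence embeddings of the source text and of each candidate (the embedding vectors here are toy values; the `screen_by_semantic_similarity` helper is hypothetical, not from the paper).

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two sentence embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def screen_by_semantic_similarity(source_vec: np.ndarray,
                                  candidate_vecs: list) -> int:
    """Return the index of the candidate translation whose embedding is
    most similar to the source-sentence embedding (assumed screening rule)."""
    scores = [cosine_similarity(source_vec, c) for c in candidate_vecs]
    return int(np.argmax(scores))

# Toy example: a 4-dimensional "embedding" of the source sentence and
# two candidate translations produced by a translation model.
source = np.array([0.9, 0.1, 0.3, 0.5])
candidates = [
    np.array([0.1, 0.8, 0.2, 0.1]),    # semantically distant candidate
    np.array([0.85, 0.15, 0.35, 0.45]) # semantically closer candidate
]
print(screen_by_semantic_similarity(source, candidates))  # -> 1
```

In practice the embeddings would come from an encoder (for example, the final LSTM hidden state), but that choice is an assumption here rather than a detail given in the abstract.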