Chinese Language Model Adaptive Method Based on Recurrent Neural Network

Authors

  • Jiangjiang Li
    Zhengzhou University of Science and Technology
  • Jiaxiang Wang
    Zhengzhou University of Science and Technology
  • Lijuan Feng
    Zhengzhou University of Science and Technology
  • Yachao Zhang
    Zhengzhou University of Science and Technology

Keywords:

Deep learning, Recurrent neural network, Chinese language model, Self-adaptation algorithm

Abstract

Deep learning is increasingly used in natural language processing. Compared with the traditional n-gram statistical language model, recurrent neural network (RNN) modeling has shown clear advantages in language modeling and has gradually been applied in speech recognition, machine translation, and other fields. However, RNN language models are currently trained mostly offline; across different speech recognition tasks, linguistic differences between the training corpus and the recognition task degrade the recognition rate of the system. This paper trains a Chinese language model with RNN modeling and proposes an online self-adaptation algorithm for the RNN model: the preliminary recognition results of the speech signal are used as a corpus to continue training the model, so that the adapted RNN model matches the recognition task as closely as possible. Experimental results show that the adaptive model effectively reduces the linguistic difference between the language model and the recognition task, and that rescoring the Chinese word confusion network further improves the recognition rate; this has been verified in a practical Chinese speech recognition system.
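The abstract describes the adaptation loop only at a high level: train an RNN language model offline, then continue training it online on the first-pass recognition output, and use the adapted model to rescore the word confusion network. The sketch below is a toy, pure-Python illustration of that idea, not the authors' implementation: `TinyRNNLM`, its sizes, learning rate, and the token data are all invented for illustration. A tiny Elman RNN is first trained on a background corpus, then continues training on made-up first-pass output, after which its log-probability of the task text (the quantity a confusion-network rescorer would consult) improves.

```python
import math
import random

class TinyRNNLM:
    """Minimal Elman RNN language model in pure Python (illustration only)."""

    def __init__(self, vocab, hidden=8, seed=0):
        rnd = random.Random(seed)
        self.vocab = list(vocab)
        self.idx = {w: i for i, w in enumerate(self.vocab)}
        self.V, self.H = len(self.vocab), hidden
        mat = lambda r, c: [[rnd.uniform(-0.1, 0.1) for _ in range(c)]
                            for _ in range(r)]
        self.Wxh = mat(self.H, self.V)   # input -> hidden
        self.Whh = mat(self.H, self.H)   # hidden -> hidden (recurrence)
        self.Why = mat(self.V, self.H)   # hidden -> output logits

    def _step(self, h, x):
        # h' = tanh(Wxh[:, x] + Whh . h)
        return [math.tanh(self.Wxh[i][x] +
                          sum(self.Whh[i][j] * h[j] for j in range(self.H)))
                for i in range(self.H)]

    def _probs(self, h):
        logits = [sum(self.Why[k][j] * h[j] for j in range(self.H))
                  for k in range(self.V)]
        m = max(logits)
        exps = [math.exp(v - m) for v in logits]
        z = sum(exps)
        return [e / z for e in exps]

    def train(self, tokens, epochs=40, lr=0.3):
        """SGD with backprop truncated to one time step -- crude, but enough
        for continued training on new text to shift the model."""
        ids = [self.idx[t] for t in tokens]
        for _ in range(epochs):
            h = [0.0] * self.H
            for x, y in zip(ids, ids[1:]):
                h_prev, h = h, self._step(h, x)
                p = self._probs(h)
                # gradient of cross-entropy w.r.t. logits: p - onehot(y)
                g = [p[k] - (1.0 if k == y else 0.0) for k in range(self.V)]
                dh = [sum(g[k] * self.Why[k][j] for k in range(self.V))
                      for j in range(self.H)]
                for k in range(self.V):
                    for j in range(self.H):
                        self.Why[k][j] -= lr * g[k] * h[j]
                for j in range(self.H):
                    dt = dh[j] * (1.0 - h[j] * h[j])   # tanh derivative
                    self.Wxh[j][x] -= lr * dt
                    for i in range(self.H):
                        self.Whh[j][i] -= lr * dt * h_prev[i]

    def logprob(self, tokens):
        """Total log-probability of a token sequence (used for rescoring)."""
        ids = [self.idx[t] for t in tokens]
        h, lp = [0.0] * self.H, 0.0
        for x, y in zip(ids, ids[1:]):
            h = self._step(h, x)
            lp += math.log(self._probs(h)[y] + 1e-12)
        return lp

# Illustrative data: an offline background corpus vs. first-pass
# recognition output from the target task (both made up).
BACKGROUND = "今天 天气 很 好 。 明天 天气 不 好 。".split()
FIRST_PASS = "股票 价格 今天 上涨 。 股票 价格 明天 下跌 。".split()

lm = TinyRNNLM(sorted(set(BACKGROUND + FIRST_PASS)))
lm.train(BACKGROUND)              # offline training phase
before = lm.logprob(FIRST_PASS)
lm.train(FIRST_PASS)              # online adaptation on first-pass output
after = lm.logprob(FIRST_PASS)    # adapted model fits the task better
```

A real system would adapt a full RNNLM with proper backpropagation through time and then rescore confusion-network arcs with the adapted scores; the toy keeps only the workflow, in which `after > before` reflects the reduced mismatch between model and task.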

Published

2025-02-21

Section

Articles

How to Cite

Li, J., Wang, J., Feng, L., & Zhang, Y. (2025). Chinese Language Model Adaptive Method Based on Recurrent Neural Network. IJLAI Transactions on Science and Engineering, 3(1), 29-34. https://sub.ifspress.hk/IJLAI/article/view/144
