ISSN 1000-1239 CN 11-1777/TP

Journal of Computer Research and Development ›› 2014, Vol. 51 ›› Issue (9): 1936-1944. doi: 10.7544/issn1000-1239.2014.20140211

Special Issue: Deep Learning 2014


A Study of Speech Recognition Based on RNN-RBM Language Model

Li Yaxiong1, Zhang Jianqiang2, Pan Deng3, Hu Dan4   

  1(Network Management Center, Hubei University of Science and Technology, Xianning, Hubei 437100); 2(Information Technology, Department of Learning Sciences & Technologies, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061, USA); 3(School of Foreign Languages, Hubei University of Science and Technology, Xianning, Hubei 437100); 4(School of Foreign Languages, Zhongnan University of Economics and Law, Wuhan 430073)
  • Online: 2014-09-01

Abstract: In recent years, deep learning has emerged as a new approach to training multilayer neural networks with back propagation. Its application to language modeling, for example the restricted Boltzmann machine (RBM) language model, has produced good results. A neural-network language model maps the word history into a continuous space and estimates the probability of the next word there, which alleviates the data-sparsity problem. In addition, some researchers have built language models on recurrent neural networks (RNNs) so that the entire preceding context can be used to predict the next word; these models, however, remain limited in capturing long-distance dependencies in language. This paper attempts to capture such long-distance information with an RNN-RBM model. Furthermore, dynamic adaptation of the language model is analyzed and illustrated with respect to linguistic features. Experimental results show that the RNN-RBM language model yields a considerable improvement in large-vocabulary continuous speech recognition.
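To make the idea behind a recurrent neural-network language model concrete, the sketch below shows a minimal forward pass: each word of the history is folded into a hidden state, and a softmax over the vocabulary gives the next-word distribution. This is a toy illustration under assumed sizes (`V`, `H`) and random weights, not the paper's RNN-RBM; the function and variable names are hypothetical.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
V, H = 5, 8                       # toy vocabulary size and hidden width (assumed)
E = rng.normal(0, 0.1, (V, H))    # word embeddings: words mapped to a continuous space
W = rng.normal(0, 0.1, (H, H))    # recurrent weights carrying the history forward
U = rng.normal(0, 0.1, (H, V))    # output weights projecting back to the vocabulary

def next_word_probs(word_ids):
    """Run the RNN over a word-id sequence; return P(next word | history)."""
    h = np.zeros(H)
    for w in word_ids:
        h = np.tanh(E[w] + W @ h)  # fold each word into the hidden state
    return softmax(h @ U)

p = next_word_probs([0, 3, 1])
print(p)  # a proper probability distribution over the V vocabulary entries
```

Because the hidden state is updated at every step, in principle the prediction can depend on the whole history rather than a fixed n-gram window; in practice plain RNNs still struggle with long-distance dependencies, which motivates the RNN-RBM combination studied in the paper.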

Key words: speech recognition, language model, neural network, recurrent neural network-restricted Boltzmann machine, relevance information
