    Citation: Zhang Zhichang, Zhang Zhenwen, Zhang Zhiman. User Intent Classification Based on IndRNN-Attention[J]. Journal of Computer Research and Development, 2019, 56(7): 1517-1524. DOI: 10.7544/issn1000-1239.2019.20180648

    User Intent Classification Based on IndRNN-Attention

    • Abstract: To address the user intent classification problem in human-computer dialogue, this paper proposes a classification method that combines an independently recurrent neural network (IndRNN) with word-level attention. A multi-layer IndRNN model encodes the user input text, effectively mitigating the gradient vanishing and gradient explosion problems common in recurrent neural networks; word-level attention raises the contribution of domain-related words to the text encoding, effectively improving classification accuracy. Experimental results show that the proposed method achieves a significant improvement on the user intent classification task.

       

      Abstract: Recently, with the development of big data and deep learning techniques, human-computer dialogue technology has emerged as a hot topic that attracts attention from both academia and industry. Numerous applications based on human-computer dialogue technology, such as Apple Siri, Microsoft Cortana, and the Huawei smart speaker, have appeared in our lives and bring us great convenience. However, making a dialogue system identify and understand user intent accurately remains a great challenge. This paper therefore proposes a novel method named IndRNN-Attention, based on the independently recurrent neural network (IndRNN) and a word-level attention mechanism, for the user intent classification problem. First, we encode the user input message text through a multi-layer IndRNN. Second, we use the word-level attention mechanism to increase the contribution of domain-related words to the encoding and generate the final representation vector of the user input message text. Finally, we classify this representation vector through a softmax layer and output the classification result. We introduce IndRNN to address the problems of gradient vanishing and gradient explosion, and integrate the word-level attention mechanism to improve the quality of the text representation. Experimental results show that the proposed IndRNN-Attention approach achieves an F-macro value of 0.93 on the user intent classification task and significantly outperforms state-of-the-art approaches.
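The pipeline described above (multi-layer IndRNN encoder, word-level attention, softmax classifier) can be sketched as follows. This is a minimal NumPy illustration of the general technique, not the authors' implementation: all dimensions, parameter initializations, and the ReLU activation are assumptions chosen for clarity. The defining property of IndRNN is visible in the recurrence: each hidden unit carries a scalar recurrent weight and only sees its own previous state.

```python
import numpy as np

rng = np.random.default_rng(0)

def indrnn_layer(xs, W, u, b):
    """One IndRNN layer: h_t = relu(W @ x_t + u * h_{t-1} + b).
    u is a per-neuron recurrent weight *vector*, so each hidden unit
    depends only on its own past state (the 'independent' recurrence
    that helps control gradient vanishing/explosion)."""
    h = np.zeros(u.shape)
    hs = []
    for x in xs:
        h = np.maximum(0.0, W @ x + u * h + b)  # element-wise recurrence
        hs.append(h)
    return np.stack(hs)  # (T, hidden): one hidden state per word

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def word_attention(hs, Wa, v):
    """Word-level attention: score each time step, then return the
    attention-weighted sum of hidden states as the text representation."""
    scores = np.array([v @ np.tanh(Wa @ h) for h in hs])
    alpha = softmax(scores)           # (T,) attention weights over words
    return alpha @ hs, alpha          # context vector of shape (hidden,)

# Toy dimensions (assumptions, not from the paper): sequence length 5,
# 4-dim word embeddings, 8 hidden units, 2 stacked layers, 3 intent classes.
T, d_in, d_h, n_cls = 5, 4, 8, 3
xs = rng.normal(size=(T, d_in))       # stand-in for word embeddings

W1, u1, b1 = rng.normal(size=(d_h, d_in)), rng.uniform(-1, 1, d_h), np.zeros(d_h)
W2, u2, b2 = rng.normal(size=(d_h, d_h)), rng.uniform(-1, 1, d_h), np.zeros(d_h)
Wa, v = rng.normal(size=(d_h, d_h)), rng.normal(size=d_h)
Wc = rng.normal(size=(n_cls, d_h))    # softmax classifier weights

hs = indrnn_layer(indrnn_layer(xs, W1, u1, b1), W2, u2, b2)  # multi-layer encoder
ctx, alpha = word_attention(hs, Wa, v)
probs = softmax(Wc @ ctx)             # intent distribution over 3 classes
print(probs)
```

In a trained model the parameters would be learned end-to-end and the attention weights `alpha` would concentrate on domain-related words; here they are random and serve only to show the data flow.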

       
