Recently, with the development of big data and deep learning techniques, human-computer dialogue technology has emerged as a hot topic that has attracted attention from both academia and industry. Numerous applications based on human-computer dialogue technology, such as Apple Siri, Microsoft Cortana, and the Huawei smart speaker, have appeared in our lives and bring us great convenience. However, enabling a dialogue system to identify and understand user intent accurately remains a great challenge. This paper therefore proposes a novel method named IndRNN-Attention, based on the independently recurrent neural network (IndRNN) and a word-level attention mechanism, for the user intent classification problem. Firstly, we encode the user's input message text with a multi-layer IndRNN. Secondly, we apply a word-level attention mechanism to increase the contribution of domain-related words to the encoding and generate the final representation vector of the input text. Finally, we feed this representation vector into a softmax layer and output the classification result. We not only introduce the IndRNN in our approach to address the problems of vanishing and exploding gradients, but also integrate a word-level attention mechanism to improve the quality of the text representation. Experimental results show that the proposed IndRNN-Attention approach achieves an F-macro score of 0.93 on the user intent classification task and significantly outperforms state-of-the-art approaches.
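
The pipeline described above (multi-layer IndRNN encoder, word-level attention pooling, softmax classifier) can be sketched as follows. This is a minimal NumPy forward pass with random weights and toy dimensions chosen purely for illustration; the actual model's layer sizes, activation, and training procedure are not specified here. The IndRNN recurrence `h_t = relu(W x_t + u * h_{t-1} + b)` with an element-wise (per-neuron) recurrent weight `u` follows the IndRNN formulation; the attention scoring via a learned context vector is one common form of word-level attention and is an assumption about the paper's exact design.

```python
import numpy as np

def indrnn_layer(x, W, u, b):
    """One IndRNN layer: h_t = relu(W @ x_t + u * h_{t-1} + b).
    Unlike a vanilla RNN, the recurrent weight u is a per-neuron vector
    (element-wise product), which helps control gradient vanishing and
    explosion over long sequences.
    x: (T, d_in) sequence of word embeddings; returns (T, d_hid) states."""
    T, d_hid = x.shape[0], b.shape[0]
    h = np.zeros(d_hid)
    out = np.zeros((T, d_hid))
    for t in range(T):
        h = np.maximum(0.0, W @ x[t] + u * h + b)
        out[t] = h
    return out

def word_attention(H, v):
    """Word-level attention: score each time step's hidden state with a
    learned context vector v, softmax the scores into weights, and return
    the weighted sum as the sentence representation."""
    scores = H @ v                      # (T,) one score per word
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                # attention weights, sum to 1
    return alpha @ H                    # (d_hid,) final text representation

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy forward pass with random weights (hypothetical dimensions only).
rng = np.random.default_rng(0)
T, d_in, d_hid, n_classes = 6, 8, 16, 4
x = rng.normal(size=(T, d_in))          # embedded user input message

# Two stacked IndRNN layers, standing in for the multi-layer encoder.
W1, u1, b1 = 0.1 * rng.normal(size=(d_hid, d_in)), rng.uniform(-1, 1, d_hid), np.zeros(d_hid)
W2, u2, b2 = 0.1 * rng.normal(size=(d_hid, d_hid)), rng.uniform(-1, 1, d_hid), np.zeros(d_hid)
H = indrnn_layer(indrnn_layer(x, W1, u1, b1), W2, u2, b2)

v = rng.normal(size=d_hid)              # attention context vector
s = word_attention(H, v)                # final representation vector

Wc = 0.1 * rng.normal(size=(n_classes, d_hid))
probs = softmax(Wc @ s)                 # intent class distribution
```

In a trained model, the predicted intent would be `probs.argmax()`; here the weights are random, so the output distribution only demonstrates the data flow, not a meaningful classification.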