    Zhuang Liansheng, Lü Yang, Yang Jian, Li Houqiang. Long Term Recurrent Neural Network with State-Frequency Memory[J]. Journal of Computer Research and Development, 2019, 56(12): 2641-2648. doi: 10.7544/issn1000-1239.2019.20180474

    Long Term Recurrent Neural Network with State-Frequency Memory

    • Modeling time series has become one of the research hotspots in the field of machine learning because of its important application value. The recurrent neural network (RNN) has been a crucial tool for modeling time series in recent years. However, existing RNNs commonly struggle to learn long-term dependencies in the temporal domain and are unable to model the frequency patterns in time series. These two problems seriously limit the performance of existing RNNs on time series that contain long-term dependencies and rich frequency components. To solve these problems, we propose the long term recurrent neural network with state-frequency memory (LTRNN-SFM), which allows the network to model the uncovered features in both the frequency and temporal domains by replacing the state vector of the hidden layer in conventional RNNs with a state-frequency matrix. Meanwhile, the proposed network can effectively mitigate the vanishing and exploding gradient problems by separating neurons in the same layer, using activation functions such as the rectified linear unit (ReLU), and clipping weights. In this way, an LTRNN-SFM with long-term memory and multiple layers can be trained easily. Experimental results demonstrate that the proposed network achieves the best performance in processing time series with long-term dependencies and rich frequency components.
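    To make the core idea concrete, the following is a minimal, illustrative sketch (not the paper's exact equations) of a recurrent cell whose hidden state is a state-frequency matrix rather than a vector, combined with ReLU activation and weight clipping as the abstract describes. All names (`sfm_cell_step`, the frequency grid `omega`, the decay factor `0.9`) are assumptions introduced for illustration only.

    ```python
    import numpy as np

    def sfm_cell_step(S, x, W, U, omega, t, clip_value=1.0):
        """One step of a simplified state-frequency recurrent cell (sketch).

        S          : (D, K) state-frequency matrix
                     (D state dimensions x K frequency bands)
        x          : (N,) input vector at time t
        W          : (D, N) input-to-state weights
        U          : (D, D) state-to-state weights
        omega      : (K,) assumed grid of angular frequencies
        clip_value : weights are clipped to [-clip_value, clip_value],
                     a simple stand-in for the weight clipping the paper
                     uses to curb exploding gradients
        """
        # weight clipping keeps the recurrent dynamics bounded
        W = np.clip(W, -clip_value, clip_value)
        U = np.clip(U, -clip_value, clip_value)
        # candidate state from the input and the frequency-aggregated
        # previous state, with ReLU as the activation function
        z = np.maximum(0.0, W @ x + U @ S.sum(axis=1))  # (D,)
        # an outer product with per-frequency modulation spreads the
        # candidate state across the K frequency bands; the decayed old
        # matrix carries the long-term temporal memory
        return 0.9 * S + np.outer(z, np.cos(omega * t))

    # toy usage: run the cell over a short random sequence
    rng = np.random.default_rng(0)
    D, K, N = 4, 3, 5
    S = np.zeros((D, K))
    W = rng.standard_normal((D, N))
    U = rng.standard_normal((D, D))
    omega = np.array([0.5, 1.0, 2.0])  # assumed frequency grid
    for t in range(10):
        S = sfm_cell_step(S, rng.standard_normal(N), W, U, omega, t)
    print(S.shape)
    ```

    The point of the sketch is only the data-flow: each hidden unit keeps K frequency-indexed components instead of a single scalar, so both temporal decay and frequency structure live in the same (D, K) state.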
