ISSN 1000-1239 CN 11-1777/TP

Journal of Computer Research and Development, 2019, Vol. 56, Issue 8: 1642-1651. doi: 10.7544/issn1000-1239.2019.20190326

Special Topic: 2019 Frontier Advances in Artificial Intelligence

• Artificial Intelligence •

Non-Stationary Multivariate Time Series Prediction with MIX Gated Unit

Liu Jiexi, Chen Songcan

  1. (MIIT Key Laboratory of Pattern Analysis and Machine Intelligence (Nanjing University of Aeronautics and Astronautics), Nanjing 211106) (liujiexi@nuaa.edu.cn)
  • Published online: 2019-08-01
  • Supported by:
    National Natural Science Foundation of China (61732006)




Abstract: Non-stationary multivariate time series (NSMTS) prediction remains a challenging task. Deep learning models based on recurrent neural networks (RNNs), in particular long short-term memory (LSTM) and gated recurrent unit (GRU) networks, have achieved impressive prediction performance. Although the LSTM has a comparatively complex architecture, it does not always dominate in performance. Recent research shows that the minimal gated unit (MGU), with its simpler gate structure, can both simplify the network architecture and improve training efficiency in computer vision and some sequence-processing problems. More importantly, our experiments show that this unit can be applied effectively to NSMTS prediction, achieving performance comparable to LSTM- and GRU-based networks. However, none of the networks built on these three gated units consistently dominates across all NSMTS. We therefore propose a novel linear MIX gated unit (MIXGU), which dynamically adjusts the mixing weights of GRU and MGU so that each MIXGU in the network attains a better hybrid structure during training. Experimental results show that the MIXGU network, which mixes the two kinds of gated units, achieves higher prediction performance than state-of-the-art models based on a single gated unit.
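The abstract gives no update equations. As a rough illustration only — not the authors' implementation — the following NumPy sketch shows one plausible form of such a linear mix, assuming the standard bias-free GRU and MGU update rules and a single trainable mixing scalar `alpha` squashed to (0, 1); all function and parameter names here are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, p):
    """One GRU step; p maps names to weight matrices (hypothetical layout)."""
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h)            # update gate
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h)            # reset gate
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h))  # candidate state
    return (1 - z) * h + z * h_tilde

def mgu_step(x, h, p):
    """One MGU step: a single forget gate replaces the GRU's two gates."""
    f = sigmoid(p["Wf"] @ x + p["Uf"] @ h)            # forget gate
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (f * h))  # candidate state
    return (1 - f) * h + f * h_tilde

def mixgu_step(x, h, gru_p, mgu_p, alpha):
    """Linear mix of the two units' hidden states; alpha is a trainable
    scalar, so the mixing weight g is learned during training."""
    g = sigmoid(alpha)
    return g * gru_step(x, h, gru_p) + (1 - g) * mgu_step(x, h, mgu_p)
```

Because `g` depends on the trainable `alpha`, gradient descent can push each MIXGU toward whichever of the two gate structures suits the series better, which is the dynamic-weighting idea the abstract describes.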

Key words: non-stationary multivariate time series (NSMTS), recurrent neural networks (RNNs), long short-term memory (LSTM), gated recurrent unit (GRU), minimal gated unit (MGU), MIX gated unit (MIXGU)
