Abstract:
Non-stationary multivariate time series (NSMTS) forecasting remains a challenging problem. Existing deep learning models based on recurrent neural networks (RNNs), in particular long short-term memory (LSTM) and gated recurrent unit (GRU) networks, have achieved impressive prediction performance. Although the LSTM architecture is relatively complex, it does not always dominate in performance. Recent research shows that the minimal gated unit (MGU), with its simpler gate structure, can both simplify the network architecture and improve training efficiency in computer vision and some sequence tasks. Most importantly, our experiments show that this unit can be effectively applied to NSMTS prediction and achieves results comparable to those of LSTM and GRU networks. However, none of the three gated-unit-based networks consistently dominates in performance across all NSMTS. Therefore, in this paper we propose a novel linear MIX gated unit (MIXGU). During training, each MIXGU dynamically adjusts the importance weights assigned to its GRU and MGU components, yielding a better hybrid structure. Experimental results show that the MIXGU network achieves higher prediction performance than state-of-the-art models based on a single gated unit.
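To make the idea concrete, the following is a minimal numpy sketch of one MIXGU step, assuming the hybrid state is a convex combination of a GRU candidate state and an MGU candidate state, weighted by a trainable scalar `alpha`. The parameter names (`Wz`, `Uf`, `alpha`, etc.) and the exact mixing rule are illustrative assumptions, not the paper's definitive formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, p):
    # Standard GRU cell: update gate z, reset gate r, candidate state.
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h + p["bz"])
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h + p["br"])
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h) + p["bh"])
    return (1 - z) * h + z * h_tilde

def mgu_step(x, h, p):
    # Minimal gated unit: a single forget gate f replaces both z and r.
    f = sigmoid(p["Wf"] @ x + p["Uf"] @ h + p["bf"])
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (f * h) + p["bh"])
    return (1 - f) * h + f * h_tilde

def mixgu_step(x, h, gru_p, mgu_p, alpha):
    # Hypothetical mixture: alpha is a trainable scalar squashed to (0, 1)
    # that weights the GRU branch against the MGU branch.
    w = sigmoid(alpha)
    return w * gru_step(x, h, gru_p) + (1 - w) * mgu_step(x, h, mgu_p)

# Toy usage with random parameters (illustrative sizes only).
rng = np.random.default_rng(0)
d_in, d_h = 3, 4

def init(keys):
    p = {}
    for k in keys:
        if k.startswith("W"):
            p[k] = 0.1 * rng.standard_normal((d_h, d_in))
        elif k.startswith("U"):
            p[k] = 0.1 * rng.standard_normal((d_h, d_h))
        else:
            p[k] = np.zeros(d_h)
    return p

gru_p = init(["Wz", "Uz", "bz", "Wr", "Ur", "br", "Wh", "Uh", "bh"])
mgu_p = init(["Wf", "Uf", "bf", "Wh", "Uh", "bh"])

x = rng.standard_normal(d_in)
h = np.zeros(d_h)
h = mixgu_step(x, h, gru_p, mgu_p, alpha=0.0)
```

In a trained network `alpha` would be learned jointly with the gate weights, so each MIXGU in the stack can settle on its own GRU/MGU balance.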