Publication · Preprint · 2018

An interpretable LSTM neural network for autoregressive exogenous model

Guo, Tian; Lin, Tao; Lu, Yao
Open Access English
  • Published: 14 Apr 2018
Abstract
In this paper, we propose an interpretable LSTM recurrent neural network, i.e., a multi-variable LSTM, for time series with exogenous variables. The attention mechanisms currently in wide use in recurrent neural networks mostly focus on the temporal aspect of the data and fall short of characterizing variable importance. To this end, our multi-variable LSTM is equipped with tensorized hidden states and learns variable-specific representations, which give rise to attention at both the temporal and the variable level. Preliminary experiments demonstrate prediction performance of the multi-variable LSTM comparable to that of encoder-decoder based baselines. More interestingly, variab...
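
The architecture the abstract describes (a tensorized hidden state holding one slice per input variable, with a softmax over those slices providing variable-level attention) can be illustrated with a short PyTorch sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: the paper's tensorized gates are approximated here by one small LSTM cell per variable, and all names and shapes (MultiVariableLSTMSketch, n_vars, d) are invented for the example.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiVariableLSTMSketch(nn.Module):
    def __init__(self, n_vars: int, d: int):
        super().__init__()
        self.n_vars, self.d = n_vars, d
        # Tensorized hidden state: one d-dimensional slice per variable,
        # approximated here by a separate small LSTM cell per variable.
        self.cells = nn.ModuleList([nn.LSTMCell(1, d) for _ in range(n_vars)])
        self.attn = nn.Linear(d, 1, bias=False)  # scores each variable's slice
        self.out = nn.Linear(d, 1)               # one-step-ahead prediction

    def forward(self, x):
        # x: (batch, time, n_vars) -- target history plus exogenous series.
        b, t, _ = x.shape
        h = [x.new_zeros(b, self.d) for _ in range(self.n_vars)]
        c = [x.new_zeros(b, self.d) for _ in range(self.n_vars)]
        for step in range(t):
            for v, cell in enumerate(self.cells):
                h[v], c[v] = cell(x[:, step, v:v + 1], (h[v], c[v]))
        H = torch.stack(h, dim=1)                        # (batch, n_vars, d)
        alpha = F.softmax(self.attn(H).squeeze(-1), 1)   # variable attention
        context = (alpha.unsqueeze(-1) * H).sum(dim=1)   # attention-weighted mix
        return self.out(context), alpha

model = MultiVariableLSTMSketch(n_vars=4, d=16)
y_hat, alpha = model(torch.randn(8, 20, 4))  # alpha: per-variable importance

Reading off alpha gives the variable-level importance that the abstract contrasts with purely temporal attention.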
Subjects
Free-text keywords: Computer Science - Learning, Statistics - Machine Learning

[Figure: per-variable histograms of attention values (x-axis: attention value, 0.0-0.5; y-axis: density); panel titles include bathroom temperature and living room temperature.]
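
Panels of the kind summarized above (a density histogram of attention values per variable) could be reproduced from the alpha output of the sketch earlier on this page. The attention weights below are simulated with a Dirichlet draw purely for illustration; real values would come from a trained model on held-out windows.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Stand-in attention weights for 4 variables over 512 test windows.
alpha = rng.dirichlet([4.0, 2.0, 1.0, 1.0], size=512)

fig, axes = plt.subplots(1, 4, figsize=(12, 3), sharey=True)
for v, ax in enumerate(axes):
    ax.hist(alpha[:, v], bins=20, density=True)
    ax.set_xlabel("Attention value")
    ax.set_title(f"Variable {v}")
axes[0].set_ylabel("Density")
plt.tight_layout()
plt.show()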
