Preprint · 2018

Foundations of Sequence-to-Sequence Modeling for Time Series

Vitaly Kuznetsov and Zelda Mariet
Open Access, English
Published: 09 May 2018
Abstract
The availability of large amounts of time series data, paired with the performance of deep-learning algorithms on a broad class of problems, has recently led to significant interest in the use of sequence-to-sequence models for time series forecasting. We provide the first theoretical analysis of this time series forecasting framework. We include a comparison of sequence-to-sequence modeling to classical time series models, and as such our theory can serve as a quantitative guide for practitioners choosing between different modeling methodologies.
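To make the framing concrete, here is a minimal, hypothetical sketch of the sequence-to-sequence setup the abstract describes: each training example maps a window of past observations directly to a window of future values, and one model predicts the whole forecast horizon jointly. This is an illustration only, not the paper's construction; a deep sequence-to-sequence model would replace the linear least-squares map below with an encoder-decoder network, while a classical autoregressive model would instead be fit one step ahead and iterated. All names and parameters (`past`, `future`, the synthetic AR(1)-like series) are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic univariate series (an AR(1)-like process, for illustration only).
T = 500
series = np.zeros(T)
for t in range(1, T):
    series[t] = 0.8 * series[t - 1] + rng.normal(scale=0.1)

past, future = 24, 6  # input window length and forecast horizon

# Sequence-to-sequence framing: build (past-window, future-window) pairs.
n = T - past - future
X = np.stack([series[i : i + past] for i in range(n)])            # (n, past)
Y = np.stack([series[i + past : i + past + future] for i in range(n)])  # (n, future)

# A linear "sequence-to-sequence" map fit by least squares: a single
# joint model outputs the entire future window at once.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Forecast the next `future` values from the most recent window.
forecast = series[-past:] @ W
print(forecast.shape)  # (6,)
```

The design point the example isolates is the data pipeline: the seq2seq framework turns forecasting into supervised learning on window pairs, which is exactly the object the paper's theory analyzes, in contrast to classical models defined by a recursive one-step-ahead equation.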
Subjects: Computer Science - Machine Learning; Computer Science - Artificial Intelligence; Statistics - Machine Learning
References (43 total; page 1 of 3 shown)

[1] Marta Banbura, Domenico Giannone, and Lucrezia Reichlin. Large Bayesian vector auto regressions. Journal of Applied Econometrics, 25(1):71-92, 2010.

[2] Sumanta Basu and George Michailidis. Regularized estimation in sparse high-dimensional time series models. Annals of Statistics, 43(4):1535-1567, 2015.

[3] Filippo Maria Bianchi, Enrico Maiorino, Michael C. Kampffmeyer, Antonello Rizzi, and Robert Jenssen. Recurrent Neural Networks for Short-Term Load Forecasting - An Overview and Comparative Analysis. Springer Briefs in Computer Science. Springer, 2017.

[4] Tim Bollerslev. Generalized autoregressive conditional heteroskedasticity. Journal of Econometrics, 31(3):307-327, 1986.

[5] George Edward Pelham Box and Gwilym Jenkins. Time Series Analysis, Forecasting and Control. Holden-Day, Incorporated, 1990.

[6] Peter J Brockwell and Richard A Davis. Time Series: Theory and Methods. Springer-Verlag New York, Inc., New York, NY, USA, 1986.

[7] P. Doukhan. Mixing: Properties and Examples. Lecture notes in statistics. Springer, 1994.

[8] Robert Engle. Autoregressive conditional heteroscedasticity with estimates of the variance of United Kingdom inflation. Econometrica, 50(4):987-1007, 1982.

[9] Valentin Flunkert, David Salinas, and Jan Gasthaus. DeepAR: Probabilistic forecasting with autoregressive recurrent networks. arXiv:1704.04110, 2017.

[10] Mahsa Ghafarianzadeh and Claire Monteleoni. Climate prediction via matrix completion. In Late-Breaking Developments in the Field of Artificial Intelligence, volume WS-13-17 of AAAI Workshops. AAAI, 2013.

[11] Hardik Goel, Igor Melnyk, and Arindam Banerjee. R2N2: Residual recurrent neural networks for multivariate time series forecasting. arXiv:1709.03159, 2017.

[12] James Douglas Hamilton. Time Series Analysis. Princeton University Press, Princeton, NJ, 1994.

[13] Fang Han, Huanran Lu, and Han Liu. A direct estimation of high dimensional stationary vector autoregressions. Journal of Machine Learning Research, 16:3115-3150, 2015.

[14] Fang Han, Sheng Xu, and Han Liu. Rate optimal estimation of high dimensional time series. Technical report, Johns Hopkins University, 2015.

[15] Sepp Hochreiter and Jürgen Schmidhuber. Long short-term memory. Neural Computation, 9(8):1735-1780, November 1997.
