TiDE: Time-series Dense Encoder
Long-term Forecasting with TiDE: Time-series Dense Encoder
TiDE (Time-series Dense Encoder) is a multi-layer perceptron (MLP) based encoder-decoder model for long-term time series forecasting; an MLP is a simple neural network built from stacked fully connected hidden layers. The encoder maps the input time series into a latent representation, while the decoder generates the future forecast values.
The model is called TiDE (Time-series Dense Encoder) because it encodes the past of a time series along with covariates using dense MLPs, and then decodes the encoded representation together with future covariates.
Encoding
The encoding step maps the past of a time series and its covariates to a dense representation of the features.
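As a rough illustration, the sketch below (in PyTorch) flattens the past values together with projected past and future covariates and passes them through a small MLP encoder. The sizes (`lookback`, `horizon`, `cov_dim`, `proj_dim`, `hidden_dim`) and the use of plain linear layers instead of TiDE's residual blocks are assumptions for illustration, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

# Hypothetical sizes for illustration only.
lookback, horizon = 96, 24        # past steps L, future steps H
cov_dim, proj_dim = 8, 4          # raw and projected covariate size
hidden_dim = 128
batch = 32

# Project the per-step covariates to a small dimension, then flatten the
# past values together with the projected past and future covariates.
feature_proj = nn.Linear(cov_dim, proj_dim)
encoder = nn.Sequential(
    nn.Linear(lookback + (lookback + horizon) * proj_dim, hidden_dim),
    nn.ReLU(),
    nn.Linear(hidden_dim, hidden_dim),
)

past = torch.randn(batch, lookback)                           # past values
covariates = torch.randn(batch, lookback + horizon, cov_dim)  # past + future covariates
flat = torch.cat([past, feature_proj(covariates).flatten(1)], dim=-1)
latent = encoder(flat)                                        # dense representation
print(latent.shape)  # torch.Size([32, 128])
```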
Decoding
The decoding step maps the encoded hidden representation into future predictions of the time series.
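Continuing the same illustration, the sketch below shows the decoding path: a dense decoder maps the latent vector to one decoded vector per future time step, and a temporal decoder combines each step's decoded vector with that step's projected covariates to produce the forecast. The sizes and the plain MLP layers are again illustrative assumptions rather than the paper's exact setup.

```python
import torch
import torch.nn as nn

# Hypothetical sizes, matching the encoding sketch above.
batch, horizon = 32, 24
hidden_dim, step_dim, proj_dim = 128, 16, 4

# Dense decoder: latent vector -> one decoded vector per future step.
dense_decoder = nn.Sequential(
    nn.Linear(hidden_dim, hidden_dim),
    nn.ReLU(),
    nn.Linear(hidden_dim, horizon * step_dim),
)
# Temporal decoder: a step's decoded vector + that step's projected
# covariates -> scalar prediction for that step.
temporal_decoder = nn.Sequential(
    nn.Linear(step_dim + proj_dim, hidden_dim),
    nn.ReLU(),
    nn.Linear(hidden_dim, 1),
)

latent = torch.randn(batch, hidden_dim)              # output of the encoder
future_cov = torch.randn(batch, horizon, proj_dim)   # projected future covariates
decoded = dense_decoder(latent).view(batch, horizon, step_dim)
forecast = temporal_decoder(torch.cat([decoded, future_cov], dim=-1)).squeeze(-1)
print(forecast.shape)  # torch.Size([32, 24])
```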
A key feature of TiDE’s architecture is the use of residual connections. These are shortcuts that let information bypass one or more layers in the neural network. By doing so, they facilitate smoother gradient flow during training, making the model easier to optimize and reducing the risk of vanishing gradients. Residual connections can also act as a form of regularization, potentially helping to prevent overfitting.
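A minimal sketch of this idea is shown below: an MLP block with a linear skip connection. The hidden size, dropout rate, and placement of LayerNorm are assumptions for illustration, not the paper's exact block definition.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Sketch of an MLP block with a linear skip connection (illustrative)."""

    def __init__(self, in_dim, hidden_dim, out_dim, dropout=0.1):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, out_dim),
            nn.Dropout(dropout),
        )
        # Linear shortcut so the skip path matches the output dimension.
        self.skip = nn.Linear(in_dim, out_dim)
        self.norm = nn.LayerNorm(out_dim)

    def forward(self, x):
        # The skip path lets gradients bypass the nonlinear layers.
        return self.norm(self.mlp(x) + self.skip(x))

# Example: blocks like this could replace the plain linear layers in the
# encoder/decoder sketches above.
block = ResidualBlock(in_dim=576, hidden_dim=128, out_dim=128)
print(block(torch.randn(32, 576)).shape)  # torch.Size([32, 128])
```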
The simplest linear version of TiDE can achieve near-optimal error rates for linear dynamical systems (LDS), a commonly used class of time series models.
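To give a rough sense of what such a linear analogue looks like, the sketch below collapses the model to a single linear map from the flattened past window to the full forecast horizon, with covariates omitted. The sizes are hypothetical and this is not the exact model analyzed in the paper.

```python
import torch
import torch.nn as nn

# Illustrative sizes only; covariates are omitted for simplicity.
lookback, horizon = 96, 24
linear_model = nn.Linear(lookback, horizon)

past = torch.randn(32, lookback)   # batch of past values
forecast = linear_model(past)      # all horizon steps predicted in one shot
print(forecast.shape)              # torch.Size([32, 24])
```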
TiDE is more than 10x faster to train than transformer-based baselines while being more accurate on benchmarks. Similar gains hold at inference time, since its cost scales only linearly with the length of the context (the number of time steps the model looks back) and with the prediction horizon. On a popular traffic forecasting benchmark, TiDE is 10.6% better than the best transformer-based baseline (PatchTST) in terms of test mean squared error (MSE).
Reference
https://arxiv.org/pdf/2304.08424.pdf