
# GATLM

Geometric Analysis of Transformer Time Series Forecasting Latent Manifolds

Our main results:

- Transformer forecasting manifolds exhibit two phases: dimensionality and curvature drop or remain fixed during encoding, then increase during decoding.
- This behavior is consistent across architectures and datasets.
- The MAPC estimate correlates with test mean squared error, enabling model comparison without a test set.
- Geometric properties of the manifolds stabilize within a few training epochs.
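The dimensionality trend above can be illustrated with a simple linear proxy. The paper uses nonlinear manifold-learning estimators, but as a minimal sketch (on synthetic data, assuming layer activations are available as `(samples, features)` arrays), one could count the PCA components needed to explain most of the variance of each layer's latents:

```python
import numpy as np

def effective_dim(H, var_threshold=0.95):
    """Number of principal components needed to explain
    `var_threshold` of the variance of activations H of
    shape (n_samples, n_features). A crude linear stand-in
    for the intrinsic-dimension estimators used in the paper."""
    Hc = H - H.mean(axis=0)                      # center the activations
    s = np.linalg.svd(Hc, compute_uv=False)      # singular values, descending
    var = s**2 / np.sum(s**2)                    # explained-variance ratios
    return int(np.searchsorted(np.cumsum(var), var_threshold) + 1)

rng = np.random.default_rng(0)
# synthetic "latents": a 3-D subspace embedded in 64-D vs. full-rank noise
low = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 64))
high = rng.normal(size=(500, 64))
print(effective_dim(low), effective_dim(high))
```

On such data the low-rank activations collapse to a small effective dimension while the full-rank ones do not, which is the kind of per-layer signal the two-phase result describes.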

## Training

The repository contains a training script for Autoformer and FEDformer on the following datasets: ETTm1, ETTh1, ETTm2, ETTh2, Electricity, Traffic, and Weather. To start training, run:

```shell
python train.py
```

You can train all the models by running the following shell script:

```shell
bash ./scripts/run_M.sh
```

## Intrinsic Dimension and Curvature Evaluation

To estimate the intrinsic dimension and curvature of the latent representations, execute the following command:

```shell
python est_curv.py --task_id ETTm1
```
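`est_curv.py` implements the paper's estimators. For intuition only, here is a self-contained sketch of one common intrinsic-dimension estimator, TwoNN (Facco et al., 2017), which infers dimension from the ratio of each point's two nearest-neighbor distances; this is not necessarily the exact method used in the repository:

```python
import numpy as np

def twonn_dimension(X):
    """TwoNN intrinsic-dimension estimate (Facco et al., 2017).

    For each point, let r1, r2 be the distances to its first and
    second nearest neighbors; the maximum-likelihood estimate of
    the dimension is N / sum(log(r2 / r1))."""
    n = X.shape[0]
    sq = np.sum(X**2, axis=1)
    # squared pairwise distances, clipped for numerical safety
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    np.fill_diagonal(d2, np.inf)   # exclude self-distances
    d2.sort(axis=1)
    mu = np.sqrt(d2[:, 1] / d2[:, 0])  # r2 / r1 >= 1 for each point
    return n / np.sum(np.log(mu))

rng = np.random.default_rng(0)
# a 2-D plane linearly embedded in 10-D ambient space
X = rng.normal(size=(1500, 2)) @ rng.normal(size=(2, 10))
print(round(twonn_dimension(X), 2))  # expected to be close to 2
```

Applied to latent representations collected from the encoder and decoder layers, an estimator like this is what makes the two-phase dimensionality profile measurable.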

## Paper

```bibtex
@article{kaufman2024analyzing,
  title={Analyzing Deep Transformer Models for Time Series Forecasting via Manifold Learning},
  author={Ilya Kaufman and Omri Azencot},
  journal={Transactions on Machine Learning Research},
  issn={2835-8856},
  year={2024},
  url={https://openreview.net/forum?id=zRZe93OZho}
}
```
