The authors' official PyTorch implementation of the paper: *Streaming Factor Trajectory Learning for Temporal Tensor Decomposition* (NeurIPS 2023).
To model temporal tensor data, we place a temporal Gaussian process prior on each tensor factor and model it as a continuous, time-varying trajectory.
The factor trajectories are inferred in a streaming, online manner.
Example of learned functional trajectories of factors from real-world data.
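For intuition, the snippet below is a minimal, hypothetical sketch (not code from this repo) of what a GP-distributed factor trajectory looks like: each latent factor of a tensor mode is a smooth random function of time, drawn here from a GP with an RBF kernel; the kernel choice, lengthscale, and rank are illustrative assumptions.

```python
# Illustrative sketch only (not code from this repo): each column of U is one
# latent factor of a tensor mode, drawn from a GP prior with an RBF kernel over
# time. Kernel, lengthscale, and rank are assumptions made for this example.
import torch

def sample_gp_trajectories(ts, rank=3, lengthscale=0.2, jitter=1e-6):
    """Draw `rank` independent GP samples evaluated at the time points `ts`."""
    diff = ts[:, None] - ts[None, :]                  # pairwise time differences
    K = torch.exp(-0.5 * (diff / lengthscale) ** 2)   # RBF kernel matrix
    L = torch.linalg.cholesky(K + jitter * torch.eye(len(ts), dtype=ts.dtype))
    return L @ torch.randn(len(ts), rank, dtype=ts.dtype)  # shape (T, rank)

ts = torch.linspace(0.0, 1.0, 100, dtype=torch.float64)  # continuous time grid
U = sample_gp_trajectories(ts)  # time-varying factors of one tensor mode
```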
The project is mainly built with PyTorch 1.10.1 under Python 3. In addition, make sure tqdm and tensorly are installed before running the project.
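A quick way to confirm the dependencies are importable (a convenience check, not part of the project):

```python
# Verify that the required packages import and print their versions.
import torch, tqdm, tensorly
print("torch:", torch.__version__)        # project built with 1.10.1
print("tqdm:", tqdm.__version__)
print("tensorly:", tensorly.__version__)
```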
- Clone this repository.
- To play with the code quickly and visualize the factors, we offer several notebooks in the `notebook` folder (on synthetic & real data).
- To run the real-world datasets with scripts, see `my_script.sh` for an example.
- To tune the (hyper)parameters of the model, modify the `.yaml` files in the `config` folder.
- To apply the model to your own dataset, please follow `./data/process_script/beijing_15k_process.ipynb` or `./data/synthetic/simu_data_generate_CP_r1.ipynb` to process the raw data into the appropriate format (a rough illustration of this kind of preprocessing is sketched after this list).
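If you want a rough sense of the kind of format such processing typically produces before opening the notebooks, the hypothetical sketch below turns a raw table into arrays of entry indices, observed values, and normalized timestamps; the column names and output files are made up, and the processing notebooks remain the authoritative reference for the format this code actually expects.

```python
# Hypothetical preprocessing sketch (column names and output files are made up;
# follow the processing notebooks for the format the code actually expects).
import numpy as np
import pandas as pd

df = pd.read_csv("raw_data.csv")   # e.g., columns: station, sensor, time, value

# Map each discrete tensor mode to contiguous integer ids.
modes = ["station", "sensor"]
indices = np.stack(
    [df[m].astype("category").cat.codes.to_numpy() for m in modes], axis=1
)

values = df["value"].to_numpy(dtype=np.float64)   # observed tensor entries
times = df["time"].to_numpy(dtype=np.float64)
times = (times - times.min()) / (times.max() - times.min())  # timestamps in [0, 1]

np.save("indices.npy", indices)
np.save("values.npy", values)
np.save("times.npy", times)
```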
Please check our paper for more details.
Please cite our work if you find it helpful:
@misc{fang2023streaming,
title={Streaming Factor Trajectory Learning for Temporal Tensor Decomposition},
author={Shikai Fang and Xin Yu and Shibo Li and Zheng Wang and Robert Kirby and Shandian Zhe},
year={2023},
eprint={2310.17021},
archivePrefix={arXiv},
primaryClass={cs.LG}
}