Agentic AI for Digital Twin
Alexander Timms, Abigail Langbridge, et al.
AAAI 2025
In this work we propose, for the first time, a transformer-based framework for unsupervised representation learning of multivariate time series. Pre-trained models can potentially be used for downstream tasks such as regression, classification, forecasting, and missing value imputation. By evaluating our models on several benchmark datasets for multivariate time series regression and classification, we show that our modeling approach is the most successful unsupervised learning method for multivariate time series presented to date; it is also the first unsupervised approach shown to exceed the current state-of-the-art performance of supervised methods. It does so by a significant margin, even when the number of training samples is very limited, while remaining computationally efficient. Finally, we demonstrate that unsupervised pre-training of our transformer models offers a substantial performance benefit over fully supervised learning, even without leveraging additional unlabeled data, i.e., by reusing the same data samples through the unsupervised objective.
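The abstract mentions an unsupervised objective that reuses the same data samples for pre-training. A common instantiation of such an objective for time series is masked-value reconstruction, sketched minimally below; the masking scheme, function names, and the zero-predicting stand-in model are illustrative assumptions, not this paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_series(x, mask_ratio=0.15):
    """Randomly zero out a fraction of (time step, variable) entries.

    Hypothetical masking scheme: independent Bernoulli masking per entry.
    Returns the masked copy and the boolean mask of hidden positions.
    """
    mask = rng.random(x.shape) < mask_ratio
    x_masked = x.copy()
    x_masked[mask] = 0.0
    return x_masked, mask

def masked_mse(pred, target, mask):
    """Reconstruction loss computed only at the masked positions."""
    return float(((pred - target)[mask] ** 2).mean())

# Toy multivariate series: 100 time steps, 3 variables.
x = rng.standard_normal((100, 3))
x_masked, mask = mask_series(x)

# A trivial stand-in "model" that predicts zeros everywhere; in practice a
# transformer encoder would reconstruct the hidden values from context.
loss = masked_mse(np.zeros_like(x), x, mask)
```

Because the loss is evaluated only where values were hidden, the encoder is forced to model dependencies across time and across variables, which is what makes the learned representations transferable to the downstream tasks listed above.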
Dzung Phan, Lam Nguyen, et al.
SDM 2024
Amadou Ba, Christopher Lohse, et al.
INFORMS 2022
Pin-Yu Chen, Cho-Jui Hsieh, et al.
KDD 2021