Alessandro Pomponio
KubeCon + CloudNativeCon NA 2025
Foundation models (FMs) have transformed natural language processing (NLP), but their successes have not yet translated to the time series domain. Existing time series foundation models (TSFMs) struggle with generalization across varying context and target lengths, lack adaptability to different sampling rates, and are computationally inefficient. We introduce FlowState, a novel TSFM architecture that addresses these challenges through two key innovations: a state space model (SSM)-based encoder and a functional basis decoder. This design enables continuous-time modeling, adjustment to various sampling rates, and flexible forecasting horizons without retraining. We further propose a parallel training strategy that enhances robustness and accelerates training. Despite being the smallest model, FlowState achieves state-of-the-art results on the GIFT-ZS and the Chronos-ZS benchmarks, while demonstrating superior adaptability to unseen sampling rates.
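The abstract only names the two components, so the sketch below is an illustrative, untrained toy of the general idea, not the authors' implementation: an SSM-style encoder whose discretization step follows the input sampling interval feeds a functional-basis decoder that can be evaluated at arbitrary horizon points. All names and choices here (STATE_DIM, NUM_BASIS, the forward-Euler discretization, the sinusoidal basis, W_coef) are assumptions made for exposition.

```python
# Hypothetical sketch of an SSM encoder + functional basis decoder.
# Untrained, random parameters; shapes and names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

STATE_DIM = 16   # latent state size (assumption)
NUM_BASIS = 8    # number of decoder basis functions (assumption)

# Continuous-time diagonal SSM: dx/dt = a * x + b * u, with negative poles for stability.
a = -(np.abs(rng.normal(size=STATE_DIM)) + 0.1)
b = rng.normal(size=STATE_DIM)
# Projection from the final state to basis coefficients.
W_coef = rng.normal(size=(NUM_BASIS, STATE_DIM)) * 0.1


def encode(series, dt):
    """Forward-Euler discretization of the continuous SSM with step `dt`.
    Real SSM layers typically use zero-order-hold or bilinear discretization;
    the point is that `dt` ties the same parameters to the sampling rate."""
    x = np.zeros(STATE_DIM)
    for u in series:
        x = x + dt * (a * x + b * u)
    return x


def decode(state, horizon_times):
    """Map the final state to coefficients of a fixed sinusoidal basis and
    evaluate it at arbitrary continuous horizon times, so the forecast
    length is not fixed at training time."""
    coef = W_coef @ state                              # (NUM_BASIS,)
    t = np.asarray(horizon_times, dtype=float)[:, None]  # (H, 1)
    freqs = np.arange(1, NUM_BASIS + 1)[None, :]       # (1, NUM_BASIS)
    basis = np.sin(2 * np.pi * freqs * t)              # (H, NUM_BASIS)
    return basis @ coef                                # (H,)


# Usage: the same (untrained) model consumes a series sampled every 0.5 time
# units and is queried at three arbitrary future offsets.
history = np.sin(np.linspace(0, 4 * np.pi, 40))
state = encode(history, dt=0.5)
forecast = decode(state, horizon_times=[0.25, 0.5, 1.0])
print(forecast.shape)  # (3,)
```

Because the decoder evaluates a continuous basis rather than emitting a fixed-length vector, the same encoder state can be queried for any horizon, which is the property the abstract describes as flexible forecasting horizons without retraining.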
Jie Ren, Zhenwei Dai, et al.
NeurIPS 2025
Tian Gao, Amit Dhurandhar, et al.
NeurIPS 2025
Jose Manuel Bernabé Murcia, Eduardo Canovas Martinez, et al.
MobiSec 2024