Mamba for Time Series Analysis: A Contemporary Survey

Time series analysis (TSA), spanning forecasting, anomaly detection, imputation, classification, and unified multi-task analytics, has become a battleground for sequence-modeling backbones, where Transformers, MLPs, convolutions, and state-space models compete on long-context benchmarks. Since the 2023 release of Mamba, its linear-time recurrence and input-dependent selectivity have triggered a surge of Mamba-based time-series models that report strong numbers yet resist like-for-like comparison. This survey provides the first focused treatment of Mamba for time series analysis, organized around four orthogonal perspectives: Model, Task, Data, and Application. The Model perspective formalizes a five-axis design space (tokenization, channel strategy, directional scan, hybridization, and decomposition) together with a three-pattern architectural toolbox of pure, bidirectional, and hybrid designs. The Task perspective re-indexes the corpus across the five canonical TSA tasks. The Data perspective re-tags it by data shape, spanning univariate, multivariate, spatio-temporal graph, and irregular series. The Application perspective then surveys deployments in healthcare, energy, traffic, climate, finance, activity recognition, and foundation-scale settings. Building on these views, we distill practical guidelines for variant selection, training recipes, and Mamba-specific pitfalls. We also catalog public implementations, datasets, and metrics, and map out open frontiers in gain attribution, modeling regimes, and cross-task unification. An online repository is maintained at https://github.com/tamlhp/awesome-mamba-ts.