Initial commit
Co-authored-by: Thomas Ortner <[email protected]>
Co-authored-by: Lars Graf <[email protected]>
Co-authored-by: Stanislaw Wozniak <[email protected]>
- README.md +86 -0
- config.json +32 -0
- figs/.DS_Store +0 -0
- figs/FlowState.png +3 -0
- figs/flowstate_performance.png +3 -0
- model.safetensors +3 -0
README.md
CHANGED
@@ -1,3 +1,89 @@
---
license: apache-2.0
---

# FlowState

[Paper](https://www.arxiv.org/abs/2508.05287) | [HuggingFace Model Card](https://huggingface.co/ibm-granite/granite-timeseries-flowstate-r1) | [GitHub Model Code](https://github.com/ibm-granite/granite-tsfm/tree/main/tsfm_public/models/flowstate)

![](figs/FlowState.png)

FlowState is the first time-scale-adjustable Time Series Foundation Model (TSFM), open-sourced by IBM Research.
Combining a State Space Model (SSM) encoder with a Functional Basis Decoder allows FlowState to transition into a timescale-invariant coefficient space and produce a continuous forecast from that space.
This lets FlowState adjust seamlessly to any sampling rate: training at one time scale helps inference at all scales, drastically improving the utilization of training data across time scales.
This innovation leads to a significant performance improvement, making FlowState the new state of the art in zero-shot time series forecasting.

## Key Features

- **FlowState**: We present an SSM-based time series foundation model that can be dynamically adjusted to the specific characteristics of the time series during evaluation.
- **Functional Basis Decoder (FBD)**: We propose a novel decoder, a critical component of FlowState, that utilizes a set of continuous basis functions to make continuous forecasts and allows seamless adjustment to specific input characteristics.
- **Flexible temporal adaptation**: FlowState can dynamically adjust the context and target length to the timescale of the provided time series.
- **Compact and high-performing**: With fewer than 10M parameters and the ability to forecast multiple consecutive patches in parallel, FlowState delivers state-of-the-art accuracy with exceptional efficiency.

This model card contains the model weights for research use only and for full reproducibility of the results published in our [paper](https://www.arxiv.org/abs/2508.05287). However, if you are looking for the FlowState model weights for commercial and enterprise use, please refer to our Granite release [here](https://huggingface.co/ibm-granite/granite-timeseries-flowstate-r1).

## Benchmark Highlights

![flowstate performance](figs/flowstate_performance.png)

Despite being **more than 10x smaller** than the three next-best models, FlowState is the **best zero-shot model** on the [GIFT-Eval Leaderboard](https://huggingface.co/spaces/Salesforce/GIFT-Eval).
The figure compares GIFT-Eval MASE performance against model size for FlowState and the ten next-best zero-shot models, as of September 9, 2025.

## Model Details

Model details can be found in our [paper](https://www.arxiv.org/abs/2508.05287).
Currently, FlowState supports only zero-shot forecasting.

## Recommended Use

FlowState can be used to make predictions as follows:

```python
import torch

from tsfm_public import FlowStateForPrediction

device = "cuda"

# Load the pretrained weights and move the model to the GPU.
predictor = FlowStateForPrediction.from_pretrained("ibm-granite/granite-timeseries-flowstate-r1").to(device)

# Random example input of shape (context, batch, n_ch).
time_series = torch.randn((2048, 32, 1), device=device)

forecast = predictor(time_series, scale_factor=0.25, prediction_length=960, batch_first=False)
print(forecast.prediction_outputs.shape)  # torch.Size([32, 9, 48, 1]) (batch, quantiles, forecast_length, n_ch)
```
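The second axis of `prediction_outputs` holds the forecast quantiles. As a minimal sketch (continuing the snippet above, and assuming the quantile axis follows the `quantiles` list in `config.json`, i.e. `[0.1, 0.2, ..., 0.9]`), a point forecast and an 80% prediction interval can be read out like this:

```python
# Continuing from the snippet above.
# Assumption: the quantile axis is ordered as in config.json: [0.1, 0.2, ..., 0.9].
preds = forecast.prediction_outputs       # (batch, quantiles, forecast_length, n_ch)

median = preds[:, 4]                      # 0.5 quantile as point forecast
lower, upper = preds[:, 0], preds[:, -1]  # 0.1 and 0.9 quantiles span an 80% interval
```
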
Users should determine a suitable scale factor for their specific time series data, as explained in the next section.

### Temporal Scaling

For common sampling rates, we recommend the following scale factors:

| Sampling Rate | Recommended Scale Factor |
|---------------|--------------------------|
| 15 min        | 0.25                     |
| 30 min        | 0.5                      |
| Hourly        | 1.0                      |
| Daily         | 3.43 if the data has a weekly cycle, else 0.0656 |
| Weekly        | 0.46                     |
| Monthly       | 2                        |

For optimal performance, we recommend first determining the seasonality of your data and calculating the scale factor from it.

Assuming the data has repeating structures every N = 96 time steps (such as quarter-hourly sampled data with a daily cycle), i.e. seasonality 96, the scale factor is calculated as follows:

scale_factor = base seasonality / N = 24 / 96 = 0.25

where 24 is the base seasonality used during pretraining.
If the seasonality is unclear, it is best to experiment with different scale factors and select the one that works best.
We recommend forecasting no more than 30 seasons ahead (in our example, 96 * 30 = 2880 time steps); beyond that, forecast quality declines.

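As a minimal sketch of this rule (the constant 24 and the example seasonalities come from the text and table above; the helper name is ours):

```python
BASE_SEASONALITY = 24  # base seasonality used during pretraining (see above)

def scale_factor(seasonality: float) -> float:
    """Recommended scale factor for data whose dominant cycle spans `seasonality` steps."""
    return BASE_SEASONALITY / seasonality

print(scale_factor(96))  # 0.25  -> quarter-hourly data with a daily cycle
print(scale_factor(7))   # ~3.43 -> daily data with a weekly cycle, as in the table above
```
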
## Installation

To run FlowState, follow the installation instructions [here](https://github.com/ibm-granite/granite-tsfm/?tab=readme-ov-file#initial-setup).
For the GIFT evaluation notebook, we recommend using Python 3.11 and installing gift-eval according to their [repo](https://github.com/SalesforceAIResearch/gift-eval).

## Example Recipes and Notebooks

- Getting started notebook: [here](https://github.com/ibm-granite/granite-tsfm/tree/main/notebooks/hfdemo/flowstate_getting_started.ipynb)
- GIFT-Eval notebook: [here](https://github.com/ibm-granite/granite-tsfm/tree/main/notebooks/hfdemo/flowstate_gift_eval.ipynb)

## Pretraining Data

As pretraining data, we used a subset of [GIFT-Eval Pretrain](https://huggingface.co/datasets/Salesforce/GiftEvalPretrain) and a subset of the [Chronos pretraining data corpus](https://huggingface.co/datasets/autogluon/chronos_datasets).
None of the datasets used (or sub-/up-sampled versions thereof) are contained in GIFT-Eval (neither the train, validation, nor test split).
All of our GIFT-Eval results are therefore zero-shot.

## Citation

Please cite the following paper if you use our model or its associated architectures/approaches in your work.

### BibTeX:

```
@article{graf2025flowstate,
  title={FlowState: Sampling Rate Invariant Time Series Forecasting},
  author={Graf, Lars and Ortner, Thomas and Wo{\'z}niak, Stanis{\l}aw and Pantazi, Angeliki and others},
  journal={arXiv preprint arXiv:2508.05287},
  year={2025}
}
```

## Model Card Authors

Lars Graf, Thomas Ortner, Stanislaw Wozniak, Angeliki Pantazi

## IBM Public Repository Disclosure

All content in this repository including code has been provided by IBM under the associated open source software license and IBM is under no obligation to provide enhancements, updates, or support. IBM developers produced this code as an open source project (not as an IBM product), and IBM makes no assertions as to the level of quality nor security, and will not be maintaining this code going forward.

config.json
ADDED
@@ -0,0 +1,32 @@
{
  "architectures": [
    "FlowStateModel"
  ],
  "context_length": 2048,
  "decoder_dim": 256,
  "decoder_patch_len": 24,
  "decoder_type": "legs",
  "embedding_feature_dim": 512,
  "encoder_num_hippo_blocks": 8,
  "encoder_num_layers": 6,
  "encoder_state_dim": 512,
  "init_processing": true,
  "prediction_type": "quantile",
  "min_context": 2048,
  "model_type": "flowstate",
  "quantiles": [
    0.1,
    0.2,
    0.3,
    0.4,
    0.5,
    0.6,
    0.7,
    0.8,
    0.9
  ],
  "torch_dtype": "float32",
  "transformers_version": "4.52.1",
  "use_freq": true,
  "with_missing": true
}

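To make the configuration concrete, here is a small illustrative sketch (assuming `config.json` from this repository has been downloaded to the working directory) that ties a few of these fields back to the usage example in the README above:

```python
import json

# Assumption: config.json from this repository is in the working directory.
with open("config.json") as f:
    cfg = json.load(f)

print(cfg["context_length"])     # 2048 -> context length used in the README example
print(cfg["decoder_patch_len"])  # 24   -> decoder patch length
print(len(cfg["quantiles"]))     # 9    -> size of the quantile axis in prediction_outputs
```
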
figs/.DS_Store
ADDED
Binary file (6.15 kB)

figs/FlowState.png
ADDED
Binary file (Git LFS)

figs/flowstate_performance.png
ADDED
Binary file (Git LFS)

model.safetensors
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c48a03b81d3152eefd7d538ad47bfec7bc6bdfeca11203dc1a273be2cb425446
size 36284680
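The pointer above records the SHA-256 digest of the resolved weights, so a quick integrity check is possible after download; a minimal sketch (assuming the resolved `model.safetensors` sits in the working directory):

```python
import hashlib

# Expected digest, taken from the Git LFS pointer above.
EXPECTED = "c48a03b81d3152eefd7d538ad47bfec7bc6bdfeca11203dc1a273be2cb425446"

with open("model.safetensors", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED, "model.safetensors does not match the LFS pointer"
```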