Nina Zukowska
committed on
Commit b0ae575
Parent(s): 544c9a2

moment base files added

Browse files
- README.md +156 -0
- config.json +38 -0
- model.safetensors +3 -0
- pytorch_model.bin +3 -0
README.md
ADDED
@@ -0,0 +1,156 @@
---
license: mit
datasets:
- AutonLab/Timeseries-PILE
metrics:
- accuracy
- mse
- mae
- f1
tags:
- time series
- forecasting
- classification
- anomaly detection
- imputation
- transformers
- pretrained models
- foundation models
- time-series
pipeline_tag: time-series-forecasting
---
# MOMENT-base

MOMENT is a family of foundation models for general-purpose time-series analysis. The models in this family (1) serve as building blocks for diverse **time-series analysis tasks** (e.g., forecasting, classification, anomaly detection, and imputation), (2) are effective **out-of-the-box**, i.e., with no (or few) task-specific exemplars (enabling, e.g., zero-shot forecasting and few-shot classification), and (3) are **tunable** using in-distribution and task-specific data to improve performance.

For details on MOMENT models, training data, and experimental results, please refer to the paper [MOMENT: A Family of Open Time-series Foundation Models](https://arxiv.org/pdf/2402.03885.pdf).

# Usage

**Recommended Python Version:** Python 3.11 (support for additional versions is expected soon).

You can install the `momentfm` package using pip:
```bash
pip install momentfm
```
Alternatively, to install the latest version directly from the GitHub repository:
```bash
pip install git+https://github.com/moment-timeseries-foundation-model/moment.git
```
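A quick import confirms the install (a minimal sanity check; this line is not part of the original instructions):
```bash
python -c "from momentfm import MOMENTPipeline; print('momentfm imported successfully')"
```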

To load the pre-trained model for one of the tasks, use one of the following code snippets:

**Forecasting**
```python
from momentfm import MOMENTPipeline

model = MOMENTPipeline.from_pretrained(
    "AutonLab/MOMENT-1-base",
    model_kwargs={
        'task_name': 'forecasting',
        'forecast_horizon': 96
    },
)
model.init()
```
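
The initialized pipeline can then be called on a batch of series. Below is a minimal sketch, assuming (as in the official tutorials) that the forward pass takes a float tensor of shape `[batch_size, n_channels, 512]` via `x_enc` and exposes the prediction as `output.forecast`:
```python
import torch

# Dummy batch: 2 univariate series at the model's fixed input length of 512.
x = torch.randn(2, 1, 512)

model.eval()
with torch.no_grad():
    output = model(x_enc=x)

# Expected shape: (2, 1, 96), i.e., forecast_horizon steps per channel.
print(output.forecast.shape)
```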

**Classification**
```python
from momentfm import MOMENTPipeline

model = MOMENTPipeline.from_pretrained(
    "AutonLab/MOMENT-1-base",
    model_kwargs={
        'task_name': 'classification',
        'n_channels': 1,
        'num_class': 2
    },
)
model.init()
```
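
As above, a hedged sketch of running the classifier once initialized; the `logits` output field is assumed from the classification tutorial:
```python
import torch

# 4 univariate series of length 512, matching n_channels=1 above.
x = torch.randn(4, 1, 512)

model.eval()
with torch.no_grad():
    output = model(x_enc=x)

# Expected logits shape: (4, 2) for num_class=2; argmax yields class predictions.
print(output.logits.argmax(dim=-1))
```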

**Anomaly Detection, Imputation, and Pre-training**
```python
from momentfm import MOMENTPipeline

model = MOMENTPipeline.from_pretrained(
    "AutonLab/MOMENT-1-base",
    model_kwargs={"task_name": "reconstruction"},
)
model.init()
```
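
In reconstruction mode the model re-predicts a (possibly masked) input, which is what imputation and anomaly detection build on. A sketch, assuming the `mask` keyword and `reconstruction` output field used in the imputation tutorial (the 1 = observed / 0 = missing convention is an assumption worth verifying there):
```python
import torch

x = torch.randn(2, 1, 512)

# Time-step mask: 1 = observed, 0 = missing (convention assumed from the
# imputation tutorial; verify against your momentfm version).
mask = torch.ones(2, 512)
mask[:, 200:240] = 0  # pretend these steps are missing

model.eval()
with torch.no_grad():
    output = model(x_enc=x, mask=mask)

x_hat = output.reconstruction          # expected shape: (2, 1, 512)
mse = ((x - x_hat) ** 2).mean(dim=-1)  # reconstruction error doubles as an anomaly score
```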

**Representation Learning**
```python
from momentfm import MOMENTPipeline

model = MOMENTPipeline.from_pretrained(
    "AutonLab/MOMENT-1-base",
    model_kwargs={'task_name': 'embedding'},
)
```
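
The embedding pipeline maps each series to a fixed-size vector for downstream models. A sketch under the same assumptions (the `embeddings` field follows the representation learning tutorial; 768 is the backbone's `d_model` in `config.json`):
```python
import torch

x = torch.randn(8, 1, 512)

model.eval()
with torch.no_grad():
    output = model(x_enc=x)

# Expected shape: (8, 768) -- one embedding per input series.
print(output.embeddings.shape)
```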

### Tutorials
Here is the list of tutorials and reproducible experiments to get started with MOMENT for various tasks:
- [Forecasting](https://github.com/moment-timeseries-foundation-model/moment/blob/main/tutorials/forecasting.ipynb)
- [Classification](https://github.com/moment-timeseries-foundation-model/moment/blob/main/tutorials/classification.ipynb)
- [Anomaly Detection](https://github.com/moment-timeseries-foundation-model/moment/blob/main/tutorials/anomaly_detection.ipynb)
- [Imputation](https://github.com/moment-timeseries-foundation-model/moment/blob/main/tutorials/imputation.ipynb)
- [Representation Learning](https://github.com/moment-timeseries-foundation-model/moment/blob/main/tutorials/representation_learning.ipynb)
- [Real-world Electrocardiogram (ECG) Case Study](https://github.com/moment-timeseries-foundation-model/moment/blob/main/tutorials/ptbxl_classification.ipynb) -- This tutorial also shows how to fine-tune MOMENT on a real-world ECG classification problem, including training and inference on multiple GPUs and parameter-efficient fine-tuning (PEFT).

## Model Details

### Model Description

- **Developed by:** [Auton Lab](https://autonlab.org/), [Carnegie Mellon University](https://www.cmu.edu/) and [University of Pennsylvania](https://www.upenn.edu/)
- **Model type:** Time-series Foundation Model
- **License:** MIT License

### Model Sources

- **Repository:** https://github.com/moment-timeseries-foundation-model/ (Pre-training and research code coming out soon!)
- **Paper:** https://arxiv.org/abs/2402.03885
- **Demo:** https://github.com/moment-timeseries-foundation-model/moment/tree/main/tutorials

## Environmental Impact

We train multiple models over many days, resulting in significant energy usage and a sizeable carbon footprint. However, we hope that releasing our models will ensure that future time-series modeling efforts are quicker and more efficient, resulting in lower carbon emissions.

We use the Total Graphics Power (TGP) to calculate the total power consumed for training MOMENT models, although the power actually drawn by the GPU will likely vary with utilization during training. Our calculations do not account for power demands from other components of our compute cluster. We use 336.566 kg CO2/MWh as the standard value of CO2 emissions per megawatt-hour of energy consumed in [Pittsburgh](https://emissionsindex.org/). A back-of-envelope illustration follows the list below.

- **Hardware Type:** NVIDIA RTX A6000 GPU
- **GPU Hours:** 89
- **Compute Region:** Pittsburgh, USA
- **Carbon Emission (tCO2eq):**

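Combining the figures above (a sketch, not an official number; the 300 W TGP of the RTX A6000 is assumed from NVIDIA's published spec):
```python
# Back-of-envelope carbon estimate from the numbers listed above.
tgp_kw = 0.300             # NVIDIA RTX A6000 Total Graphics Power (300 W, assumed spec)
gpu_hours = 89             # from the list above
kg_co2_per_mwh = 336.566   # Pittsburgh emission factor quoted above

energy_mwh = tgp_kw * gpu_hours / 1000    # ~0.0267 MWh
carbon_kg = energy_mwh * kg_co2_per_mwh   # ~9.0 kg CO2eq
print(f"{energy_mwh:.4f} MWh -> {carbon_kg:.2f} kg CO2eq")
```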
#### Hardware

All models were trained and evaluated on a computing cluster consisting of 128 AMD EPYC 7502 CPUs, 503 GB of RAM, and 8 NVIDIA RTX A6000 GPUs each with 49 GiB RAM. All MOMENT variants were trained on a single A6000 GPU (without any data or model parallelism).

## Citation

**BibTeX:**
If you use MOMENT, please cite our paper:

```bibtex
@inproceedings{goswami2024moment,
  title={MOMENT: A Family of Open Time-series Foundation Models},
  author={Mononito Goswami and Konrad Szafer and Arjun Choudhry and Yifu Cai and Shuo Li and Artur Dubrawski},
  booktitle={International Conference on Machine Learning},
  year={2024}
}
```

**APA:**

Goswami, M., Szafer, K., Choudhry, A., Cai, Y., Li, S., & Dubrawski, A. (2024). MOMENT: A Family of Open Time-series Foundation Models. In International Conference on Machine Learning. PMLR.
config.json
ADDED
@@ -0,0 +1,38 @@
{"task_name": "reconstruction",
 "model_name": "MOMENT",
 "transformer_type": "encoder_only",
 "d_model": null,
 "seq_len": 512,
 "patch_len": 8,
 "patch_stride_len": 8,
 "device": "cpu",
 "transformer_backbone": "google/flan-t5-base",
 "model_kwargs": {},
 "t5_config": {
    "architectures": [
      "T5ForConditionalGeneration"
    ],
    "d_ff": 2048,
    "d_kv": 64,
    "d_model": 768,
    "decoder_start_token_id": 0,
    "dropout_rate": 0.1,
    "eos_token_id": 1,
    "feed_forward_proj": "gated-gelu",
    "initializer_factor": 1.0,
    "is_encoder_decoder": true,
    "layer_norm_epsilon": 1e-06,
    "model_type": "t5",
    "n_positions": 512,
    "num_decoder_layers": 12,
    "num_heads": 12,
    "num_layers": 12,
    "output_past": true,
    "pad_token_id": 0,
    "relative_attention_max_distance": 128,
    "relative_attention_num_buckets": 32,
    "tie_word_embeddings": false,
    "use_cache": true,
    "vocab_size": 32128
 }
}
model.safetensors
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1a436826ffe618273ec62b9656dc4cab8edc470364f104e90542a4ebc14fb825
size 453940120
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:23c3d65bbb6dcd323352029e9fbe4ee3a3da0fff55b45ee4e00f38fff4e9bfb9
size 453978525