anaprietonem committed
Commit 5a3ec83
2 Parent(s): 8dc1f98 aea7c51

Merge branch 'main' of https://huggingface.co/ecmwf/aifs-single

Files changed (1)
  1. README.md +18 -15
README.md CHANGED
@@ -1,5 +1,5 @@
  ---
- license: cc-by-sa-4.0
  metrics:
  - mse
  pipeline_tag: graph-ml
@@ -41,7 +41,8 @@ and direct observational data.

  - **Developed by:** ECMWF
  - **Model type:** Encoder-processor-decoder model
- - **License:** CC BY-SA 4.0

  ### Model Sources
 
@@ -57,22 +58,24 @@ and direct observational data.
  To generate a new forecast using AIFS, you can use [anemoi-inference](https://github.com/ecmwf/anemoi-inference). In the [following notebook](run_AIFS_v0_2_1.ipynb), a
  step-by-step workflow shows how to run the AIFS using the HuggingFace model:
 
- - Install the required packages
- - Select a date
- - Get the data from the [ECMWF Open Data API](https://www.ecmwf.int/en/forecasts/datasets/open-data)
- - Get input fields
- - Add the single-level fields and pressure-level fields
- - Convert geopotential height into geopotential
- - Create the initial state
- - Create a runner
- - Run the forecast
- - Plot a field
 
 
 
 
  🚨 **Note** we train AIFS using `flash_attention` (https://github.com/Dao-AILab/flash-attention).
- There are currently some issues when trying to install flash attention with the latest PyTorch version 2.5 and CUDA 12.4 (https://github.com/Dao-AILab/flash-attention/issues/1330).
- For that reason, we recommend you install PyTorch 2.4.
- Additionally, the use of the 'Flash Attention' package also imposes certain requirements in terms of software and hardware. Those can be found under 'Installation and Features' in https://github.com/Dao-AILab/flash-attention

  🚨 **Note** the `aifs_single_v0.2.1.ckpt` checkpoint just contains the model’s weights.
  That file does not contain any information about the optimizer states, lr-scheduler states, etc.
 
  ---
+ license: cc-by-4.0
  metrics:
  - mse
  pipeline_tag: graph-ml

  - **Developed by:** ECMWF
  - **Model type:** Encoder-processor-decoder model
+ - **License:** These model weights are published under a Creative Commons Attribution 4.0 International (CC BY 4.0) licence.
+ To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/

  ### Model Sources
 
 
  To generate a new forecast using AIFS, you can use [anemoi-inference](https://github.com/ecmwf/anemoi-inference). In the [following notebook](run_AIFS_v0_2_1.ipynb), a
  step-by-step workflow shows how to run the AIFS using the HuggingFace model:
 
+ 1. **Install Required Packages and Imports**
+ 2. **Retrieve Initial Conditions from ECMWF Open Data** (see the sketches after this list)
+    - Select a date
+    - Get the data from the [ECMWF Open Data API](https://www.ecmwf.int/en/forecasts/datasets/open-data)
+    - Get input fields
+    - Add the single-level fields and pressure-level fields
+    - Convert geopotential height into geopotential
+    - Create the initial state
+ 3. **Load the Model and Run the Forecast**
+    - Download the Model's Checkpoint from Hugging Face
+    - Create a runner
+    - Run the forecast using anemoi-inference
+ 4. **Inspect the generated forecast**
+    - Plot a field
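
A rough sketch of step 2 is shown below. It is not taken verbatim from the notebook: the `ecmwf-opendata` client is the real package behind the Open Data API, but the parameter lists, pressure levels and the use of `xarray`/`cfgrib` to read the GRIB files are illustrative assumptions.

```python
# Hedged sketch of step 2: download a few single-level and pressure-level fields
# from ECMWF Open Data and convert geopotential height (gh, in m) into
# geopotential (z, in m^2 s^-2). Parameter and level choices are illustrative only.
from ecmwf.opendata import Client
import xarray as xr

client = Client(source="ecmwf")

# Single-level fields -- adjust `param` to the variables the checkpoint expects.
client.retrieve(
    type="fc",
    step=0,
    levtype="sfc",
    param=["10u", "10v", "2t", "msl"],
    target="sfc.grib2",
)

# Pressure-level fields, including geopotential height.
client.retrieve(
    type="fc",
    step=0,
    levtype="pl",
    levelist=[1000, 850, 500, 250],
    param=["gh", "t", "u", "v", "q"],
    target="pl.grib2",
)

# Read the pressure-level file (the notebook may use earthkit-data instead;
# the variable key below may differ depending on the GRIB reader used).
pl = xr.open_dataset("pl.grib2", engine="cfgrib")

# AIFS uses geopotential z rather than geopotential height gh:
# z [m^2 s^-2] = gh [m] * g0, where g0 is the standard gravity.
G0 = 9.80665  # m s^-2
z = pl["gh"] * G0
```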
 
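For step 3, the checkpoint published in this repository can be fetched with `huggingface_hub` and handed to an anemoi-inference runner. The sketch below is modelled on the example notebook, but the `SimpleRunner` import path, its arguments and the `lead_time` unit are assumptions to verify against run_AIFS_v0_2_1.ipynb and the anemoi-inference documentation.

```python
# Hedged sketch of step 3: download the weights-only checkpoint and run a short
# forecast. The runner class and its arguments are assumptions modelled on the
# example notebook, not a confirmed API.
from huggingface_hub import hf_hub_download

checkpoint_path = hf_hub_download(
    repo_id="ecmwf/aifs-single",
    filename="aifs_single_v0.2.1.ckpt",
)

# Assumption: the simple runner used in the notebook.
from anemoi.inference.runners.simple import SimpleRunner

runner = SimpleRunner(checkpoint_path, device="cuda")

# `input_state` is the initial-state dictionary assembled in step 2 of the notebook
# (date plus the input fields); it is not reproduced here.
input_state = ...

# Assumption: lead_time is given in hours and the runner yields successive states.
states = list(runner.run(input_state=input_state, lead_time=12))
```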
 
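For step 4, one quick way to inspect a field on the model's grid is a scatter plot of the grid-point values. The sketch below uses synthetic arrays so it runs on its own; in practice the longitudes, latitudes and field values come from the forecast state returned by the runner.

```python
# Hedged sketch of step 4: plot one output field as a scatter of grid-point values.
# Synthetic arrays stand in for the forecast output; swap them for the longitudes,
# latitudes and one field taken from the runner's output state.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n_points = 40_320  # illustrative grid size, not the model's exact grid

lons = rng.uniform(-180.0, 180.0, n_points)
lats = rng.uniform(-90.0, 90.0, n_points)
# Fake "2 m temperature" field in kelvin, warmer near the equator.
values = 288.0 + 15.0 * np.cos(np.radians(lats)) + rng.normal(0.0, 2.0, n_points)

fig, ax = plt.subplots(figsize=(10, 5))
sc = ax.scatter(lons, lats, c=values, s=1, cmap="coolwarm")
fig.colorbar(sc, ax=ax, label="2t [K]")
ax.set_xlabel("longitude")
ax.set_ylabel("latitude")
ax.set_title("AIFS forecast field (illustrative data)")
plt.show()
```
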
  🚨 **Note** we train AIFS using `flash_attention` (https://github.com/Dao-AILab/flash-attention).
+ The use of the 'Flash Attention' package also imposes certain requirements in terms of software and hardware. Those can be found under 'Installation and Features' in https://github.com/Dao-AILab/flash-attention
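
Before launching a run, it can help to check that the environment meets those requirements. A small, hedged check follows; the compute-capability cut-off quoted in the comments should be confirmed against the flash-attention README.

```python
# Hedged environment check for flash-attention: verify that the package imports and
# that the GPU is recent enough. FlashAttention-2 generally targets Ampere or newer
# (compute capability >= 8.0); confirm the exact support matrix in the project README.
import torch

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    print(f"GPU: {torch.cuda.get_device_name(0)} (compute capability {major}.{minor})")
    if major < 8:
        print("Warning: this GPU is likely too old for FlashAttention-2.")

try:
    import flash_attn
    print("flash-attn:", flash_attn.__version__)
except ImportError:
    print("flash-attn is not installed; see the flash-attention README for build requirements.")
```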
 
 
 
  🚨 **Note** the `aifs_single_v0.2.1.ckpt` checkpoint just contains the model’s weights.
  That file does not contain any information about the optimizer states, lr-scheduler states, etc.