annezj committed on
Commit b0f4be9
1 Parent(s): 6bce0b5

Upload README.md

Files changed (1)
  1. README.md +1 -4
README.md CHANGED
@@ -1,6 +1,3 @@
- ---
- license: apache-2.0
- ---
  # Model Card for granite-geospatial-uki
  
  The granite-geospatial-uki model is a transformer-based geospatial foundation model trained on HLS L30 multispectral satellite imagery and Sentinel-1 synthetic aperture radar (SAR) backscatter over the United Kingdom and Ireland. The model consists of a self-supervised encoder developed with a ViT architecture and Masked AutoEncoder (MAE) learning strategy, with an MSE loss function and follows the same architecture as [Prithvi-EO](https://huggingface.co/collections/ibm-nasa-geospatial/prithvi-for-earth-observation-6740a7a81883466bf41d93d6).
@@ -14,7 +11,7 @@ The granite-geospatial-uki model is a transformer-based geospatial foundation mo
  
  ## How to Get Started with the Model
  
- An example of fine-tuning the model for image segmentation using Terratorch for flood detection in the UK and Ireland can be found [here](./notebooks/2_fine_tuning.ipynb).
+ An example of fine-tuning the model for image segmentation using Terratorch for flood detection in the UK and Ireland can be found [here](https://github.com/ibm-granite/geospatial/blob/main/uki-flooddetection/notebooks/2_fine_tuning.ipynb).
  
  ## Pre-training
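The model card above describes the pretraining recipe only at a high level: a ViT encoder trained as a Masked AutoEncoder (MAE) with an MSE reconstruction loss. The snippet below is a minimal, illustrative sketch of that objective in plain PyTorch. It is not the granite-geospatial-uki or Prithvi-EO training code; the patch size, band count, layer widths, and mask ratio are arbitrary assumptions, and positional embeddings are omitted for brevity.

```python
# Illustrative sketch only (assumptions throughout): a toy masked autoencoder
# that encodes the visible patches, reconstructs the masked ones, and scores
# the reconstruction with an MSE loss, mirroring the recipe described above.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyMAE(nn.Module):
    def __init__(self, patch_dim=6 * 16 * 16, embed_dim=128, mask_ratio=0.75):
        super().__init__()
        self.mask_ratio = mask_ratio
        self.embed = nn.Linear(patch_dim, embed_dim)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(embed_dim, nhead=4, batch_first=True),
            num_layers=2,
        )
        # Learned placeholder token standing in for each masked patch.
        self.mask_token = nn.Parameter(torch.zeros(1, 1, embed_dim))
        self.decoder = nn.Linear(embed_dim, patch_dim)

    def forward(self, patches):
        # patches: (batch, num_patches, patch_dim) flattened multispectral patches
        b, n, d = patches.shape
        n_keep = n - int(self.mask_ratio * n)

        # Randomly shuffle patch indices; the first n_keep stay visible.
        ids = torch.rand(b, n, device=patches.device).argsort(dim=1)
        ids_keep, ids_mask = ids[:, :n_keep], ids[:, n_keep:]

        # Encoder only sees the visible patches.
        visible = torch.gather(patches, 1, ids_keep.unsqueeze(-1).expand(-1, -1, d))
        encoded = self.encoder(self.embed(visible))

        # Decoder input: encoded visible tokens plus mask tokens for hidden patches
        # (a real MAE also adds positional embeddings here; omitted for brevity).
        mask_tokens = self.mask_token.expand(b, n - n_keep, -1)
        recon = self.decoder(torch.cat([encoded, mask_tokens], dim=1))[:, n_keep:]

        # MSE loss computed only on the masked patches.
        target = torch.gather(patches, 1, ids_mask.unsqueeze(-1).expand(-1, -1, d))
        return F.mse_loss(recon, target)


# Toy usage: batch of 2 images, 196 patches each, 6 bands x 16x16 pixels per patch.
loss = TinyMAE()(torch.randn(2, 196, 6 * 16 * 16))
loss.backward()
```

The point the sketch highlights is that the encoder processes only the small visible subset of patches, which is what keeps MAE-style self-supervised pretraining on large multispectral archives comparatively cheap.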