Modalities: Tabular
Formats: csv
Libraries: Datasets, pandas
ssl_oli_metadata update

#3
No description provided.
crinistad changed pull request status to closed
crinistad changed pull request status to open
TorchGeo org

It looks like the list of files is still incomplete. There are only 51278 locations when there should be 250K. The others will require computing the x,y (lon,lat) coordinates of all image centroids and redownloading the metadata. I'm still okay with merging this PR as is since it uncorrupts the file, but for full reproducibility, we really should have all metadata.

Oh, I didn't realise. I will add all the other metadata today. Maybe we can hold it open for today as well. Additionally, since I am doing it for this one, I want to double-check all the other metadata files, because the metadata is generally very small in some folders. How would you recommend checking this?

TorchGeo org

I was just using:

> ls -1f metadata/ssl4eo_l_oli_sr/ | wc -l
   51280

but it's probably better to also check subdirectories to make sure all 4 times are available.

Yeah -- I am using this as well. I will write a script -- thank you.
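For reference, a minimal sketch of such a completeness check, assuming each location directory under the metadata folder should contain exactly four date subdirectories (one per seasonal acquisition); the function name and the exact layout are assumptions, not the repository's actual script:

```python
import os
import sys

def find_incomplete(root, expected=4):
    """List location directories under `root` that do not contain
    exactly `expected` subdirectories (one per acquisition date)."""
    incomplete = []
    for loc in sorted(os.listdir(root)):
        loc_path = os.path.join(root, loc)
        if not os.path.isdir(loc_path):
            continue
        # Count only subdirectories, ignoring stray files.
        n = sum(
            os.path.isdir(os.path.join(loc_path, d))
            for d in os.listdir(loc_path)
        )
        if n != expected:
            incomplete.append((loc, n))
    return incomplete

if __name__ == "__main__":
    for loc, n in find_incomplete(sys.argv[1]):
        print(f"{loc}: {n}/4")
```

Run as e.g. `python check_metadata.py metadata/ssl4eo_l_oli_sr/` to print every location that is missing one or more dates.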

Hi - I am trying to run this, but it's not running. Can you please tell me the parameters that were passed to the script to generate reproducible metadata -- there is nowhere to check this. I found some comments in the file, but this is TOA:

Example 1: download Landsat-8, match pre-sampled locations

python download_ssl4eo.py \
    --save-path ./data \
    --collection LANDSAT/LC08/C02/T1_TOA \
    --meta-cloud-name CLOUD_COVER \
    --cloud-pct 20 \
    --dates 2021-12-21 2021-09-22 2021-06-21 2021-03-20 \
    --radius 1980 \
    --bands B1 B2 B3 B4 B5 B6 B7 B8 B9 B10 B11 \
    --dtype float32 \
    --num-workers 8 \
    --log-freq 100 \
    --match-file ./data/sampled_locations.csv \
    --indices-range 0 250000

TorchGeo org

i am trying to run this but it's not running

You're going to have to give me a more detailed error message than this.

Can you please tell me the parameters that were passed to the script

See https://github.com/microsoft/torchgeo/blob/main/experiments/ssl4eo/landsat/download_oli_sr.sh. https://github.com/microsoft/torchgeo/tree/main/experiments/ssl4eo/landsat#readme gives explanations of the important variables you may need to change.

Hi - sorry for the unclear wording. The current entries (the file download_oli_sr.sh already has hard-coded values) will not regenerate the metadata that already exists for oli_sr. I get at least 7 locations out of 10 where data is not available (without changing anything in the .sh file). Additionally, I read the README. It currently does not talk about reproducibility, but rather explains how to create one's own data. I don't want to create my own data - I just need to fill in the missing metadata for the existing files. I think for that I need the actual values that were used to generate the original dataset.

Hope this helps.

All location counts:

ls -1f ssl4eo_l_oli_sr/data | wc -l
157765
ls -1f ssl4eo_l_oli_sr/metadata | wc -l
51280

ls -1f ssl4eo_l_oli_tirs_toa/data | wc -l
250002
ls -1f ssl4eo_l_oli_tirs_toa/metadata | wc -l
196824

ls -1f ssl4eo_l_etm_sr/data | wc -l
250002
ls -1f ssl4eo_l_etm_sr/metadata | wc -l
250002

ls -1f ssl4eo_l_etm_toa/data | wc -l
250002
ls -1f ssl4eo_l_etm_toa/metadata | wc -l
105351

ls -1f ssl4eo_l_tm_toa/data | wc -l
250002
ls -1f ssl4eo_l_tm_toa/metadata | wc -l
250002


TorchGeo org

I think for that- I need the actual value that was used for generating the original dataset.

Yes, you can get the sample locations by saving the x,y coordinates of each image centroid to a CSV file.
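As an illustration of that step in pure Python: the affine transform and image size would in practice come from reading each GeoTIFF (e.g. with rasterio's `dataset.transform`, `dataset.width`, `dataset.height`), a projected CRS would still need reprojecting to lon/lat, and the helper names here are hypothetical:

```python
import csv

def pixel_to_coords(transform, col, row):
    """Apply a 6-element affine transform (a, b, c, d, e, f) to a
    pixel (col, row): x = a*col + b*row + c, y = d*col + e*row + f."""
    a, b, c, d, e, f = transform
    return (a * col + b * row + c, d * col + e * row + f)

def image_centroid(transform, width, height):
    """Map coordinates of the center of a width x height image."""
    return pixel_to_coords(transform, width / 2, height / 2)

def write_centroids(rows, path):
    """Write (location_id, x, y) rows to a CSV of sample locations."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
```

For example, a 264x264 chip at 30 m resolution with upper-left corner at (500000, 4000000) has its centroid at (503960, 3996040) in the same CRS.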

ls -1f ssl4eo_l_oli_sr/data | wc -l
157765

Wait, are we missing some images? That's not good...

It looks like we are also missing some metadata for oli_tirs_toa and etm_toa. Sorry about that. If you need the metadata for your application, I can try to redownload it when I have a chance.

Yes, you can get the sample locations by saving the x,y coordinates of each image centroid to a CSV file.

I don't think it depends only on x,y - the date ranges are also very important for reproducing this, in my opinion. But I suppose it is very manual work - you first get the date and lat/lon of the extracted image and then fetch the relevant metadata. I suppose it might be easier to just redownload the whole thing, albeit time-consuming.

TorchGeo org

As long as you use the same hyperparameters (e.g., year, cloud_pct) as listed in https://github.com/microsoft/torchgeo/blob/main/experiments/ssl4eo/landsat/download_oli_sr.sh, it should give you the same metadata. But it's probably faster to modify the download script to grab both the location and date from the directory name and image file and only query those.
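A sketch of that recovery step, assuming a data/&lt;location_id&gt;/&lt;scene_id&gt;/ layout where the scene ID follows the Landsat Collection naming scheme (e.g. LC08_L2SP_123032_20210620_..., with the fourth underscore-separated field being the acquisition date); the function name and exact layout are assumptions:

```python
import os
import re
from datetime import datetime

def existing_scenes(data_root):
    """Yield (location_id, acquisition_date) pairs recovered from a
    data/<location_id>/<scene_id>/ directory layout, so only those
    locations and dates need to be re-queried for metadata."""
    for loc in sorted(os.listdir(data_root)):
        loc_path = os.path.join(data_root, loc)
        if not os.path.isdir(loc_path):
            continue
        for scene in sorted(os.listdir(loc_path)):
            # Fourth underscore-separated field of a Landsat scene ID
            # is the YYYYMMDD acquisition date.
            m = re.match(r"^[A-Z0-9]+_[A-Z0-9]+_\d{6}_(\d{8})_", scene)
            if m:
                yield loc, datetime.strptime(m.group(1), "%Y%m%d").date()
```

The resulting pairs could then drive a targeted metadata query instead of a full 250K-location redownload.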

It is downloading now - let's see how long it takes. I didn't change anything for this one. It's weird that it still reports that some locations aren't found for some date ranges, but maybe those tifs were also not downloaded.

Sorry -- I am now getting:

batch response: You have read access but not the required permissions for this operation
error: failed to push some refs to 'https://huggingface.co/datasets/torchgeo/ssl4eo_l'

Do you have any idea why that is?

TorchGeo org

It looks like you're trying to push directly to our main branch. You need to instead push to your PR branch before we review and merge. I don't have a ton of experience with HuggingFace PRs, but maybe https://huggingface.co/docs/hub/en/repositories-pull-requests-discussions will help.

No - I am not pushing to the main branch - I am on my branch, the PR branch. I can try again; maybe something went weird.

Ready to merge
This branch is ready to get merged automatically.
