Clean refs/convert/duckdb

#4
by zinc75 - opened
Laboratoire de Mécanique des Structures et des Systèmes Couplés org

cc @albertvillanova @lhoestq @severo

We are in the process of re-uploading the dataset in its final form. The main branch has a substantial commit history with over 1,950 commits, as we tested various combinations of subsets in earlier preliminary versions.

Once the upload is complete, we plan to use super_squash_history to clean the main branch. From my understanding, after running the super_squash_history command, only the latest commit will remain, and there will be no references to the old subsets from the preliminary versions. However, I have noticed that the refs/convert/duckdb directory contains numerous subfolders corresponding to outdated subsets that we no longer use.
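For reference, a minimal sketch of how we plan to trigger the squash via the huggingface_hub client (the repo id is ours; everything else assumes the current `HfApi` interface and a configured write token):

```python
# Minimal sketch, assuming huggingface_hub's HfApi and a token with the
# "write" role already configured (e.g. via `huggingface-cli login`).
from huggingface_hub import HfApi


def squash_main() -> None:
    """Squash the full history of `main` into a single commit (irreversible)."""
    api = HfApi()
    api.super_squash_history(
        repo_id="Cnam-LMSSC/vibravox",
        repo_type="dataset",
        branch="main",
    )


if __name__ == "__main__":
    squash_main()
```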

I have reviewed the documentation but could not find any instructions on how to completely clean this auto-generated branch so that only the subfolders corresponding to the subsets in the latest commit remain.

Could you please advise on how to handle this gracefully? We prefer not to delete and recreate the repository, as this would result in losing over 100 access requests that we want to retain.

Thanks for your help.

Laboratoire de Mécanique des Structures et des Systèmes Couplés org

cc @albertvillanova @lhoestq @severo

There are also more than 3,000 commits in the auto-generated refs/convert/parquet branch.

Is it safe to use super_squash_history on this branch to remove all commits corresponding to old versions of the dataset/subsets?

Best regards,

Laboratoire de Mécanique des Structures et des Systèmes Couplés org
edited 18 days ago

FYI, one more comment: we tried using super_squash_history on the main branch, and it completely disrupted the push_to_hub functionality.

Every time we attempted to push the new version of the dataset to the squashed main branch, we encountered the following error:

```
403 Forbidden: Access Denied.
Cannot access content at: https://huggingface.co/datasets/Cnam-LMSSC/vibravox.git/info/lfs/objects/batch.
If you are trying to create or update content, make sure you have a token with the `write` role.
```

This is puzzling, since it is not a token issue: we managed to push a dummy dataset to this repo, including Git-LFS files, but not the actual vibravox dataset files. We also tried pushing to a new branch, thinking the problem was specific to the main branch, but the same error occurred.

Our current workaround is to create another dataset (vibravox2) on the hub, push our files there, duplicate the README.md from vibravox, delete the existing vibravox repo, and then rename vibravox2 to vibravox. The downside is that we would lose the 100+ pending access requests for the gated dataset, discussions, and likes. The upside is that the refs/convert/* branches would be clean again.
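For completeness, the last two steps of that workaround can be sketched with the huggingface_hub client (repo ids are from this thread; the function only mirrors the manual steps described above and is not something we have committed to running):

```python
# Hypothetical sketch of the workaround, assuming huggingface_hub's HfApi.
# Warning: deleting the repo drops the pending access requests, discussions
# and likes mentioned above.
from huggingface_hub import HfApi


def replace_vibravox_with_vibravox2() -> None:
    api = HfApi()
    # vibravox2 is assumed to already contain the freshly pushed files
    # and the README.md duplicated from vibravox.
    api.delete_repo(repo_id="Cnam-LMSSC/vibravox", repo_type="dataset")
    api.move_repo(
        from_id="Cnam-LMSSC/vibravox2",
        to_id="Cnam-LMSSC/vibravox",
        repo_type="dataset",
    )


if __name__ == "__main__":
    replace_vibravox_with_vibravox2()
```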

Given that we've tried numerous workarounds and read many related issues on GitHub with no successful solution, it seems the problem might be server-side rather than on our end.

Could you investigate and possibly fix this on your side to allow us to push our dataset to vibravox, or should we proceed with our cumbersome workaround?

Best regards,

cc @albertvillanova @lhoestq @severo @polinaeterna

Laboratoire de Mécanique des Structures et des Systèmes Couplés org
edited 17 days ago

Surprisingly, this worked:

```python
from datasets import load_dataset

dataset_speech_noisy = load_dataset("Cnam-LMSSC/vibravox2", "speech_noisy")
dataset_speech_noisy.push_to_hub("Cnam-LMSSC/vibravox", "speech_noisy")
```

But the classic upload of local files to "Cnam-LMSSC/vibravox" still produces the 403 Forbidden error, while the same code works for any other dataset, such as "Cnam-LMSSC/vibravox2".

Laboratoire de Mécanique des Structures et des Systèmes Couplés org

Hi! No, it's not the same: the GitHub error you mention is different (it is related to LFS, while yours is related to permissions).

Have you tried re-creating a token, just in case the current one has an issue?

Laboratoire de Mécanique des Structures et des Systèmes Couplés org

Thanks for your reply, but yes, I already tried that. @zinc75 mentioned this issue because the error only appears when pushing LFS files.

Ok, we are investigating internally

We found some files that had not been deleted correctly and removed them; you can try uploading the dataset again now :)

Laboratoire de Mécanique des Structures et des Systèmes Couplés org

It's working well now, thanks! 👌
