Question about convert-hf-to-gguf-update.py

#27
by SolidSnacke - opened

When starting it I'm asked to enter `<hf-read-token>`. Does this mean I need to enter my Access Token?

Because I did, and when trying to download llama-bpe I get an error: Failed to download file. Status code: 403

I was eventually able to download the files, but it seems that only worked after I got access to the Meta repository. I'm not sure, though.

SolidSnacke changed discussion status to closed
AetherArchitectural org
edited May 4, 2024

Yes, you run `...-update.py <hf-read-token>`.
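For context, that read token gets sent along with each Hugging Face download the script makes. A minimal sketch of how such an authenticated request can be built with the standard library (the helper name and URL are mine, not the script's):

```python
from urllib.request import Request

def build_hf_request(url: str, hf_token: str) -> Request:
    # Hypothetical helper, not from convert-hf-to-gguf-update.py itself:
    # a Hugging Face read token is normally sent as a Bearer header.
    return Request(url, headers={"Authorization": f"Bearer {hf_token}"})
```

Without a valid token (or without having accepted a gated model's license), the server answers 403.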

Some files will give you a 403 error because they don't exist in the remote repos, but the ones we need should download fine into `models\tokenizers\llama-bpe`.
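To make the two failure modes in this thread explicit, here is a small hypothetical helper (not part of the script) that maps the status codes discussed above to their likely cause:

```python
def describe_download_error(status: int, gated: bool = False) -> str:
    # Illustrative only: summarizes the common outcomes seen in this thread.
    if status == 200:
        return "ok"
    if status == 403 and gated:
        return "access denied: accept the model's license on Hugging Face first"
    if status == 403:
        return "forbidden: file missing from the remote, or token lacks access"
    return f"unexpected status {status}"
```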

Then replace the files in the model folder with the ones from the llama-bpe folder.

The thing is, I was only able to download them once I got access to Meta-Llama-3. Before I gained access to their repository, I got a 403 error, i.e. access was denied.
I'm just not sure whether the README should mention that downloading these files requires access to the Meta repository.

AetherArchitectural org
edited May 6, 2024

Huh, I think this should be included. I never thought about that. Lemme see if I have access to them...

Yeah, indeed. Oh well then!

For example, I cannot get these files with this account's token. But with my second account, which has access to Meta-Llama-3, I can.

AetherArchitectural org
edited May 6, 2024

Added to the Warning. I think everyone I've talked to so far was already deep into this and had access, haha. Thanks for the reminder.

Not sure if I would get in trouble for hosting these myself...

Let it be just a warning. There's nothing complicated about it.

AetherArchitectural org
edited May 7, 2024

It's inside the [!WARNING] that was already there; I just added more context about the repo access needed to fetch the files. Should be alright.
