Cannot download the model with huggingface-cli

#11
by lulmer - opened

Hello,
I have trouble downloading this model using the CLI

huggingface-cli download meta-llama/Meta-Llama-3.2-3B-Instruct

gives me the following error:

Traceback (most recent call last):
  File "/home/vscode/py-env/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 304, in hf_raise_for_status
    response.raise_for_status()
  File "/home/vscode/py-env/lib/python3.11/site-packages/requests/models.py", line 1024, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/api/models/meta-llama/Meta-Llama-3.2-3B-Instruct/revision/main

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/vscode/py-env/bin/huggingface-cli", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/home/vscode/py-env/lib/python3.11/site-packages/huggingface_hub/commands/huggingface_cli.py", line 51, in main
    service.run()
  File "/home/vscode/py-env/lib/python3.11/site-packages/huggingface_hub/commands/download.py", line 146, in run
    print(self._download())  # Print path to downloaded files
          ^^^^^^^^^^^^^^^^
  File "/home/vscode/py-env/lib/python3.11/site-packages/huggingface_hub/commands/download.py", line 180, in _download
    return snapshot_download(
           ^^^^^^^^^^^^^^^^^^
  File "/home/vscode/py-env/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/home/vscode/py-env/lib/python3.11/site-packages/huggingface_hub/_snapshot_download.py", line 233, in snapshot_download
    raise api_call_error
  File "/home/vscode/py-env/lib/python3.11/site-packages/huggingface_hub/_snapshot_download.py", line 164, in snapshot_download
    repo_info = api.repo_info(repo_id=repo_id, repo_type=repo_type, revision=revision, token=token)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/vscode/py-env/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/home/vscode/py-env/lib/python3.11/site-packages/huggingface_hub/hf_api.py", line 2491, in repo_info
    return method(
           ^^^^^^^
  File "/home/vscode/py-env/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/home/vscode/py-env/lib/python3.11/site-packages/huggingface_hub/hf_api.py", line 2301, in model_info
    hf_raise_for_status(r)
  File "/home/vscode/py-env/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 352, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 404 Client Error. (Request ID: Root=1-66f54b74-57e3a2477a203b7c3805d63f;0b7ae875-1169-4a3b-9c8d-467c511a1230)

Repository Not Found for url: https://huggingface.co/api/models/meta-llama/Meta-Llama-3.2-3B-Instruct/revision/main.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
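One thing worth ruling out alongside authentication: the Llama 3.2 repos on the Hub appear to be published without the "Meta-" prefix that the Llama 3 repos used, and a mistyped repo id produces the same 404 as a missing token, because gated and private repos are also reported as "Repository Not Found". A sketch of the command with the unprefixed name (assuming current Hub naming):

```shell
# Assumption: the repo id is meta-llama/Llama-3.2-3B-Instruct
# (no "Meta-" prefix), unlike e.g. meta-llama/Meta-Llama-3-8B-Instruct.
# A wrong repo id 404s even with a valid token.
huggingface-cli download meta-llama/Llama-3.2-3B-Instruct
```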

Are you authenticated with a User Access Token?

@supportend Yes, I can download other models with no issues.

Llama models are special: you have to agree to share your contact information, and you need a User Access Token to verify that you have done so, before you can access the model files.

Did you save the token in an environment variable? I ask because I don't see options like login or login --token in your input.

See the huggingface-cli login section here:

https://huggingface.co/docs/huggingface_hub/guides/cli#huggingface-cli-login
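For reference, the token can also be supplied non-interactively. A minimal sketch, assuming the token has been exported as HF_TOKEN (the variable name is the standard one read by recent huggingface_hub versions; the token value below is a placeholder):

```shell
# Option 1: pass the token to the login command instead of the prompt.
huggingface-cli login --token "$HF_TOKEN"

# Option 2: recent huggingface_hub versions read HF_TOKEN directly,
# so exporting it is enough for the download to authenticate.
export HF_TOKEN=hf_xxx   # placeholder; use your own User Access Token
huggingface-cli download meta-llama/Llama-3.2-3B-Instruct
```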

What I've done is

huggingface-cli login

Then the terminal asks for my token and I copy-paste it into the console. By the way, I made sure I had accepted the license for the 3.2 family as well.

I also previously accepted the license for Llama3.1 models and have no issue downloading them.

I have the same error too. The first time, I logged in with huggingface-cli, pasted my token, and downloaded Llama 3.2 3B. When I tried again afterwards, I got a 404 error: ValueError: Error raised by inference API HTTP code: 404, {"error":"model "meta-llama/Llama-3.2-1B" not found, try pulling it first"}.

To download Meta Llama models, go to the "Files and versions" tab on the model page and expand "Review and access" to submit the form. Meta will then grant you access to their models.
(Two screenshots showing where to find and submit the access request form on the model page.)
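Once access has been granted, the same download can also be done from Python. A minimal sketch, assuming a valid token is already saved via huggingface-cli login (or exported as HF_TOKEN) and that access to the gated repo has been approved:

```python
# Minimal sketch: download a gated model snapshot with huggingface_hub.
# Assumes access has been granted and a token is available locally.
from huggingface_hub import snapshot_download

# Returns the local path of the cached snapshot once all files are fetched.
local_dir = snapshot_download(repo_id="meta-llama/Llama-3.2-3B-Instruct")
print(local_dir)
```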
