
Error: pull model manifest ollama

by nprime496

Hi, I'm trying to use this model with Ollama as described in https://huggingface.co/docs/hub/ollama, but it isn't working.

Command:
ollama run hf.co/google/gemma-7b-GGUF

Result:

pulling manifest 
Error: pull model manifest: Get "Authentication%!r(MISSING)equired?nonce=VZsAKGnamjCp-giE0xXapw&scope=&service=&ts=1729508446": unsupported protocol scheme ""
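For reference, my understanding of that docs page is that the command should follow this pattern, where the quantization tag is optional (Q4_K_M below is just an example, I'm not sure which quants this repo actually provides):

ollama run hf.co/{username}/{repository}
ollama run hf.co/{username}/{repository}:{quantization}
ollama run hf.co/google/gemma-7b-GGUF:Q4_K_M

I get the same kind of error either way, so I assume the problem is not the tag.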
