Error raised by inference API: Cannot override task for LLM models

#115
opened by subhayanwbgmail

Suddenly started getting this error without any code changes. It was working fine 2 days back.

Hi, did you figure it out? I also suddenly started getting this error.

Not yet, still getting the same error.

I am also getting this error. My code worked a few days ago.

My code was working earlier, but I'm getting this error now.

I have the same issue

Yes, same here.

I have the same issue, has anyone solved it?

I just updated the langchain library to the latest version.

Did anyone solve the issue?

I solved it by editing the huggingface_hub/inference_api.py file: on line 152 I replaced the api_url assignment with self.api_url = f"{INFERENCE_ENDPOINT}/models/{repo_id}"
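
For anyone who'd rather not patch the installed package, here is a rough sketch of the same fix applied at runtime with huggingface_hub directly. It assumes InferenceApi and INFERENCE_ENDPOINT are still exposed under these names; the token and repo_id are just placeholders:

from huggingface_hub import InferenceApi
from huggingface_hub.constants import INFERENCE_ENDPOINT

repo_id = 'tiiuae/falcon-7b-instruct'
client = InferenceApi(repo_id=repo_id, token='hf_...')  # your HF token here
# Point the client at the plain /models/ route instead of the task-specific one
client.api_url = f'{INFERENCE_ENDPOINT}/models/{repo_id}'
print(client(inputs='foo bar'))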

Same problem here, it was working and then just stopped.

You can alternatively override the client's API URL, e.g.:

from langchain.llms import HuggingFaceHub

llm = HuggingFaceHub(repo_id='tiiuae/falcon-7b-instruct', huggingfacehub_api_token=huggingfacehub_api_token)
# Point the client straight at the /models/ URL for this repo
llm.client.api_url = 'https://api-inference.huggingface.co/models/tiiuae/falcon-7b-instruct'
llm.invoke('foo bar')
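
If I'm reading it right, this points the request at https://api-inference.huggingface.co/models/<repo_id>, which is the same URL the line-152 patch above builds, so both workarounds amount to the same thing.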

Great, that also works @nij4t

I solved it by updating the libraries: langchain and huggingface-hub.

Can you share the command? Which versions did you update to?

@vaidehirao uninstall the previous versions and then run:

pip install langchain
pip install huggingface-hub
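
If you'd rather not uninstall first, upgrading both in place should also work (assuming a recent pip):

pip install --upgrade langchain huggingface-hub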
