Model not loaded on the server

#28
by divakaivan - opened

I am getting:
Model not loaded on the server: https://api-inference.huggingface.co/models/codellama/CodeLlama-34b-Instruct-hf. Please retry with a higher timeout (current: 120).
I used the model in my code a month ago and it worked. When I checked now, the model no longer works :(
Code: https://github.com/divakaivan/text2chart/blob/main/app.py
Edit: I am new to Hugging Face, so I am not sure what is happening.
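
For reference, a minimal sketch of calling the serverless Inference API with a longer client-side timeout and wait_for_model enabled, as the error message suggests (this is not the exact code in app.py; the token, prompt, and parameters below are placeholders):

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/codellama/CodeLlama-34b-Instruct-hf"
HF_TOKEN = "hf_..."  # placeholder, use your own access token

headers = {"Authorization": f"Bearer {HF_TOKEN}"}
payload = {
    "inputs": "Write a Python function that returns the first n Fibonacci numbers.",
    "parameters": {"max_new_tokens": 200},
    # wait_for_model asks the API to hold the request while the model loads
    # instead of returning a "model not loaded" error right away.
    "options": {"wait_for_model": True},
}

# Raise the client-side timeout well above the default 120 seconds,
# since a 34B model can take a long time to load if it loads at all.
response = requests.post(API_URL, headers=headers, json=payload, timeout=600)
response.raise_for_status()
print(response.json())
```

If the request still times out after this, it may be that the 34B checkpoint is simply no longer kept loaded on the shared serverless infrastructure, in which case a dedicated endpoint or a smaller model might be needed.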
