#106
by tasmay - opened

The Inference API of this model is not working; it returns a "Model not loaded yet" error. Please resolve this.

Hey ... try passing the token into the AutoTokenizer as well, not only into AutoModelForCausalLM:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained(model_id, token='<your token>')
model = AutoModelForCausalLM.from_pretrained(model_id, token='<your token>')
```
