Inference API Doesn't Work
#4 opened by DrNicefellow
I tried the text generation with the Inference API from the model card. It complains about a wrong transformers version, but the version looks correct: 4.40.0.dev0. Somehow the inference still doesn't work.
I've updated the version in config.json as a hopeful fix. Maybe it will start working in a couple of hours?
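For anyone hitting the same error, here's a minimal sketch of how to compare the transformers version recorded in the checkpoint's config.json against the version installed locally (the repo id below is a placeholder, not this model):

```python
import json

import transformers
from huggingface_hub import hf_hub_download

# Placeholder repo id; replace with the actual model repo.
config_path = hf_hub_download("your-username/your-model", "config.json")

with open(config_path) as f:
    config = json.load(f)

# Compare the version the checkpoint was saved with against what is installed locally.
print("config.json transformers_version:", config.get("transformers_version"))
print("locally installed transformers:  ", transformers.__version__)
```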
Still doesn't work. Maybe the Inference API is running an old transformers version and ignores the version specified in config.json.
It's working today at least. I'm guessing it's as you said: the Inference API was using an old version.
DrNicefellow changed discussion status to closed