"Model is overloaded, please wait for a bit"
#4
by yongzx - opened
I ran into this issue (similar to https://huggingface.co/bigscience/bloom/discussions/70) when using the Inference API. I still get this message even after waiting for tens of minutes.
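For reference, a minimal retry sketch of the call (assuming a standard `requests`-based request to the Inference API; the token placeholder and retry settings are illustrative, and `options.wait_for_model` is the API's documented flag for waiting out a cold model load, which may not help when the model is genuinely overloaded):

```python
import time
import requests

API_URL = "https://api-inference.huggingface.co/models/bigscience/bloom"
HEADERS = {"Authorization": "Bearer <YOUR_HF_TOKEN>"}  # placeholder token

def query(payload, max_retries=5, backoff=10):
    """POST to the Inference API, retrying while the model is loading or overloaded."""
    for attempt in range(max_retries):
        response = requests.post(API_URL, headers=HEADERS, json=payload)
        if response.status_code == 200:
            return response.json()
        # A 503 covers both "model loading" and "overloaded" responses;
        # wait and retry with a simple linear backoff.
        time.sleep(backoff * (attempt + 1))
    response.raise_for_status()

result = query({
    "inputs": "Hello, my name is",
    # Ask the API to hold the request until the model is loaded
    # instead of returning an immediate 503.
    "options": {"wait_for_model": True},
})
print(result)
```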
christopher changed discussion status to closed