Why is there an error on the inference widget?
#46
by sarasc · opened
This has been happening for the last couple of days: I'm unable to use the widget or the Inference API locally. This is the error:
The model bigscience/bloom-7b1 is too large to be loaded automatically (14GB > 10GB). For commercial use please use PRO spaces (https://huggingface.co/spaces) or Inference Endpoints (https://huggingface.co/inference-endpoints).
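For context, this error is returned by the hosted Inference API itself, not by anything local. A minimal sketch of the kind of request that triggers it is below, assuming the standard `api-inference.huggingface.co` endpoint; the token `hf_xxx` is a placeholder, not a real credential:

```python
import json
import urllib.request

# Standard hosted Inference API endpoint for this model.
API_URL = "https://api-inference.huggingface.co/models/bigscience/bloom-7b1"


def query(payload, token="hf_xxx"):  # hf_xxx is a placeholder token
    """Send a JSON payload to the Inference API and return the parsed response.

    For models over the free-tier size limit (here 14GB > 10GB), the API
    responds with the error quoted above instead of generated text.
    """
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

A call such as `query({"inputs": "Hello"})` would need a valid token, and for a model of this size the hosted free tier declines to load it, which is why the widget shows the same message.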
christopher changed discussion status to closed