Inference API is not working
#1 opened by xianbao (HF staff)
I also get the same error: "The model_type 'ernie_m' is not recognized. It could be a bleeding edge model, or incorrect"
Ah, thanks
@xianbao
@jjoergensen
for pointing this out. It's because the model is "bleeding edge", meaning it is included in transformers==4.27, which has not been released yet. To run the model before the 4.27 release, you need to install transformers from source, i.e. pip install git+https://github.com/huggingface/transformers. The inference widget should work once 4.27 is officially released. I'm adding an explanation for this to the model card.
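As a quick sanity check, you can compare your installed transformers version against 4.27 to see whether you still need the source install. This is a minimal sketch; the helper name is mine, and 4.27 is simply the release mentioned above:

```python
def needs_source_install(installed: str, required: tuple = (4, 27)) -> bool:
    """Return True if the installed transformers version predates the
    release that first ships this model type (transformers 4.27)."""
    major, minor = (int(part) for part in installed.split(".")[:2])
    return (major, minor) < required

print(needs_source_install("4.26.1"))  # True: install from source
print(needs_source_install("4.27.0"))  # False: released version suffices
```

In practice you would pass `transformers.__version__` as the first argument.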
MoritzLaurer changed discussion status to closed
MoritzLaurer changed discussion status to open
Ah, and it requires sentencepiece, so you also need to run pip install sentencepiece.
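Putting the two installs from this thread together, the setup before the 4.27 release would look something like this (a sketch of the commands given above, not an official install guide):

```shell
# transformers 4.27 is unreleased at the time of this thread,
# so install from source to get the new model type:
pip install git+https://github.com/huggingface/transformers
# the tokenizer additionally needs sentencepiece:
pip install sentencepiece
```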