This model is a character-split version of ernie-1.0 (i.e., every non-whitespace character is split into its own token): the tokenizer and the model have had all tokens longer than one character removed.
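The vocabulary trimming described above can be sketched as follows. This is a minimal illustration, not the script actually used to build the model: it assumes a WordPiece-style vocabulary with bracketed special tokens (e.g. `[CLS]`), keeps only special tokens and single-character tokens, and reassigns contiguous ids. The sample vocabulary is hypothetical.

```python
def filter_char_vocab(vocab_tokens):
    """Keep special tokens and length-1 tokens; reassign contiguous ids.

    vocab_tokens: list of token strings in original id order.
    Returns a dict mapping kept token -> new id.
    """
    specials = {"[PAD]", "[UNK]", "[CLS]", "[SEP]", "[MASK]"}
    kept = [t for t in vocab_tokens if t in specials or len(t) == 1]
    return {t: i for i, t in enumerate(kept)}

# Hypothetical sample vocabulary for illustration.
sample = ["[PAD]", "[UNK]", "[CLS]", "[SEP]", "[MASK]", "的", "中国", "a", "ab", "学"]
new_vocab = filter_char_vocab(sample)
# Multi-character tokens such as "中国" and "ab" are dropped;
# single characters and special tokens survive with new ids.
```

When the vocabulary shrinks this way, the corresponding rows of the model's token-embedding matrix would be dropped in the same order so that the new ids still index the right embeddings.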
