bloom-3b / tokenizer_config.json
Commit 5d7ac41 by ybelkada: Change to correct `tokenizer_class` (#3)
{
  "unk_token": "<unk>",
  "eos_token": "</s>",
  "bos_token": "<s>",
  "pad_token": "<pad>",
  "name_or_path": "bigscience/tokenizer",
  "special_tokens_map_file": null,
  "tokenizer_class": "BloomTokenizerFast"
}
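This config is what `transformers` consults when loading the tokenizer: the `tokenizer_class` key selects the Python class to instantiate (here `BloomTokenizerFast`, the value corrected in commit #3), and the special-token keys set the corresponding tokenizer attributes. A minimal, illustrative sketch of how the fields are read, using only the standard library and the JSON content verbatim (this is not the actual `transformers` loading code):

```python
import json

# tokenizer_config.json content, copied verbatim from above.
config_text = (
    '{"unk_token": "<unk>", "eos_token": "</s>", "bos_token": "<s>", '
    '"pad_token": "<pad>", "name_or_path": "bigscience/tokenizer", '
    '"special_tokens_map_file": null, "tokenizer_class": "BloomTokenizerFast"}'
)
config = json.loads(config_text)

# "tokenizer_class" is the key AutoTokenizer uses to pick the class;
# here it resolves to transformers.BloomTokenizerFast.
print(config["tokenizer_class"])               # BloomTokenizerFast
# Special-token fields map to tokenizer attributes of the same name.
print(config["eos_token"], config["pad_token"])  # </s> <pad>
```

In practice the same fields are consumed automatically by `AutoTokenizer.from_pretrained("bigscience/bloom-3b")`, which downloads this file from the Hub and instantiates the named class.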