{"special_tokens_map_file": "pretrained_model_hf_large_165K/special_tokens_map.json", "name_or_path": "pretrained_model_hf_large_165K", "tokenizer_class": "PreTrainedTokenizerFast"}