banglabert_large_generator / tokenizer_config.json
{
  "do_lower_case": false,
  "tokenize_chinese_chars": false,
  "special_tokens_map_file": null,
  "full_tokenizer_file": null
}
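In practice this config would usually be consumed by the Hugging Face `transformers` tokenizer loader, but a minimal sketch of reading and interpreting the raw JSON (the config text is embedded inline here for self-containment) looks like this:

```python
import json

# Inline copy of the tokenizer_config.json shown above.
config_text = (
    '{"do_lower_case": false, "tokenize_chinese_chars": false, '
    '"special_tokens_map_file": null, "full_tokenizer_file": null}'
)

config = json.loads(config_text)

# do_lower_case=False: input text is not lowercased before tokenization,
# preserving case in any embedded Latin-script text (Bengali script itself
# is caseless).
# tokenize_chinese_chars=False: CJK characters are not split into
# single-character tokens by the pre-tokenizer.
print(config["do_lower_case"])           # False
print(config["tokenize_chinese_chars"])  # False
```

Both flags being `false` is the expected setting for a cased, Bengali-focused model, where lowercasing and CJK splitting would discard or distort information.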