ko-barTNumText / tokenizer_config.json
Commit 9ba3613: Init: Model Upload
{
"name_or_path": "/data2/bart/temp_workspace/nlp/models/kobart-base-v2",
"special_tokens_map_file": "/data2/bart/temp_workspace/nlp/models/kobart-base-v2/special_tokens_map.json",
"tokenizer_class": "PreTrainedTokenizerFast"
}
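
This config only records how the tokenizer was saved: name_or_path and special_tokens_map_file point to the local kobart-base-v2 paths used at save time, while tokenizer_class tells the transformers library to instantiate PreTrainedTokenizerFast. A minimal loading sketch follows, assuming the hub repo id is lIlBrother/ko-barTNumText (inferred from the page header, not stated in the file):

    # Minimal sketch: loading the tokenizer described by this config.
    # The repo id below is an assumption based on the page header.
    from transformers import AutoTokenizer

    # tokenizer_class in tokenizer_config.json selects PreTrainedTokenizerFast;
    # the recorded local paths are metadata from when the tokenizer was saved.
    tokenizer = AutoTokenizer.from_pretrained("lIlBrother/ko-barTNumText")  # assumed repo id

    encoded = tokenizer("123개를 주문했습니다.")  # example Korean input (hypothetical)
    print(encoded.input_ids)
    print(tokenizer.decode(encoded.input_ids, skip_special_tokens=True))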