french_semantic/0_Transformer/special_tokens_map.json
Commit f991aed: First version of the french_semantic model and tokenizer.
{"bos_token": "<s>", "eos_token": "</s>", "unk_token": "<unk>", "sep_token": "</s>", "pad_token": "<pad>", "cls_token": "<s>", "mask_token": {"content": "<mask>", "single_word": false, "lstrip": true, "rstrip": false, "normalized": true}, "additional_special_tokens": ["<s>NOTUSED", "</s>NOTUSED"]}