Bert_Finetuning_Text_Classification / special_tokens_map.json
{
"cls_token": "[CLS]",
"mask_token": "[MASK]",
"pad_token": "[PAD]",
"sep_token": "[SEP]",
"unk_token": "[UNK]"
}
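
This file maps each special-token role to the literal string the tokenizer uses. A minimal sketch of how it is consumed, assuming the repo id omid-ebi/Bert_Finetuning_Text_Classification taken from the page header (substitute your own checkpoint if it differs):

from transformers import AutoTokenizer

# Repo id assumed from the page header; the file above is read automatically.
tokenizer = AutoTokenizer.from_pretrained("omid-ebi/Bert_Finetuning_Text_Classification")

# These attributes are populated from special_tokens_map.json.
print(tokenizer.cls_token)   # "[CLS]"
print(tokenizer.sep_token)   # "[SEP]"
print(tokenizer.pad_token)   # "[PAD]"
print(tokenizer.unk_token)   # "[UNK]"
print(tokenizer.mask_token)  # "[MASK]"

# The CLS and SEP tokens are inserted automatically when encoding text.
ids = tokenizer("hello world")["input_ids"]
print(tokenizer.convert_ids_to_tokens(ids))  # e.g. ['[CLS]', 'hello', 'world', '[SEP]']

The PAD token is used to pad batches to a common length, and MASK is the placeholder token for masked-language-model objectives; UNK stands in for any piece not found in the vocabulary.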