tokenizer_config.json — commit 8a14b8b ("adding model and tokenizer", Nicola De Cao)
{"model_max_length": 512, "unk_token": "[UNK]", "cls_token": "[CLS]", "sep_token": "[SEP]", "pad_token": "[PAD]", "mask_token": "[MASK]", "tokenizer_class": "PreTrainedTokenizerFast"}