tgpt-opt-nano/tokenizer/tokenizer_config.json
{
  "model_max_length": 512,
  "pad_token": "[PAD]",
  "unk_token": "[UNK]",
  "cls_token": "[CLS]",
  "sep_token": "[SEP]",
  "mask_token": "[MASK]",
  "tokenizer_class": "PreTrainedTokenizerFast"
}
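This file is read by Hugging Face `transformers` when loading the tokenizer: `tokenizer_class` selects `PreTrainedTokenizerFast`, the `*_token` entries define the special tokens, and `model_max_length` caps input sequence length. A minimal sketch, using only the Python standard library, that parses the config shown above and reads those fields:

```python
import json

# Verbatim contents of tokenizer_config.json from this repo.
CONFIG_TEXT = (
    '{"model_max_length": 512, "pad_token": "[PAD]", "unk_token": "[UNK]", '
    '"cls_token": "[CLS]", "sep_token": "[SEP]", "mask_token": "[MASK]", '
    '"tokenizer_class": "PreTrainedTokenizerFast"}'
)

config = json.loads(CONFIG_TEXT)

# transformers dispatches on "tokenizer_class" when instantiating the
# tokenizer, and uses "model_max_length" to truncate/pad sequences.
print(config["tokenizer_class"])   # PreTrainedTokenizerFast
print(config["model_max_length"])  # 512
```

In practice this file is not loaded by hand: calling `PreTrainedTokenizerFast.from_pretrained(...)` (or `AutoTokenizer.from_pretrained(...)`) on the repo reads it automatically, alongside the `tokenizer.json` vocabulary file it accompanies.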