gpt-2-pretrained-26m / special_tokens_map.json
{
  "bos_token": "<|endoftext|>",
  "eos_token": "<|endoftext|>",
  "pad_token": "<|endoftext|>",
  "unk_token": "<|endoftext|>"
}
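
This file maps all four special-token roles (bos, eos, pad, unk) to the single `<|endoftext|>` token, which is the standard GPT-2 convention. Below is a minimal sketch of how the map is consumed at load time, assuming the Hugging Face transformers library and the repo id `Granther/gpt-2-pretrained-26m` (inferred from the page header):

```python
from transformers import AutoTokenizer

# Load the tokenizer; special_tokens_map.json is read as part of this call.
tokenizer = AutoTokenizer.from_pretrained("Granther/gpt-2-pretrained-26m")

# All four special roles resolve to the same token, as declared in the JSON above.
print(tokenizer.bos_token)  # <|endoftext|>
print(tokenizer.eos_token)  # <|endoftext|>
print(tokenizer.pad_token)  # <|endoftext|>
print(tokenizer.unk_token)  # <|endoftext|>
```

Reusing `<|endoftext|>` as the pad token is a common choice for GPT-2-style models, since the original tokenizer defines no dedicated padding token.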