gpt2-data-tokenizer / special_tokens_map.json
{
"bos_token": "<|endoftext|>",
"eos_token": "<|endoftext|>",
"unk_token": "<|endoftext|>"
}
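
This file maps the GPT-2 special tokens, with the beginning-of-sequence, end-of-sequence, and unknown tokens all set to "<|endoftext|>", which is the standard GPT-2 convention. A minimal sketch of loading and inspecting the tokenizer follows; the repo id "Sabrina1763/gpt2-data-tokenizer" is assumed from the page title and may differ.

# Minimal sketch: load the tokenizer and check its special tokens.
# Assumes the repo id "Sabrina1763/gpt2-data-tokenizer" (inferred, not confirmed).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Sabrina1763/gpt2-data-tokenizer")

# For GPT-2 style tokenizers, bos/eos/unk all resolve to "<|endoftext|>".
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.unk_token)

# special_tokens_map reflects the contents of special_tokens_map.json above.
print(tokenizer.special_tokens_map)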