gpt2-med-finetuned-wikitext103 / special_tokens_map.json
urialon · add tokenizer · 2021215
90 Bytes
{"bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "unk_token": "<|endoftext|>"}
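For reference, the map above can be inspected with the standard-library `json` module. This is a minimal sketch that parses the file content inline rather than loading it through `transformers`; it shows that GPT-2 reuses a single `<|endoftext|>` token for the beginning-of-sequence, end-of-sequence, and unknown-token roles.

```python
import json

# The special_tokens_map.json content shown above, inlined for illustration.
special_tokens = json.loads(
    '{"bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", '
    '"unk_token": "<|endoftext|>"}'
)

# All three special-token roles map to the same token string.
assert (
    special_tokens["bos_token"]
    == special_tokens["eos_token"]
    == special_tokens["unk_token"]
)
print(special_tokens["eos_token"])  # <|endoftext|>
```

In practice this file is read automatically when the tokenizer is loaded (e.g. via `AutoTokenizer.from_pretrained(...)`); parsing it by hand is only useful for quick inspection.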