gpt2-finetuned-greek-small / tokenizer_config.json
Initial commit (496bab2)
{
  "pad_token": "<|endoftext|>",
  "special_tokens_map_file": null,
  "full_tokenizer_file": null
}
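As a minimal sketch of how this file is consumed, the config is plain JSON: `pad_token` is set to GPT-2's end-of-text token, while the `null` entries mean no separate special-tokens map or bundled tokenizer file is referenced. The snippet below parses the exact contents shown above with the standard library (no Hugging Face dependency assumed):

```python
import json

# Raw contents of tokenizer_config.json, as shown above.
config_text = (
    '{"pad_token": "<|endoftext|>", '
    '"special_tokens_map_file": null, '
    '"full_tokenizer_file": null}'
)

config = json.loads(config_text)

# GPT-2 has no dedicated pad token, so the end-of-text token is reused.
print(config["pad_token"])               # <|endoftext|>

# JSON null becomes Python None: no external files are referenced.
print(config["special_tokens_map_file"]) # None
print(config["full_tokenizer_file"])     # None
```

Libraries such as `transformers` read this file when loading the tokenizer and apply `pad_token` automatically; the fields left `null` are simply skipped.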