historical-irish-tokenizer-bpe / tokenizer_config.json
{
  "clean_up_tokenization_spaces": true,
  "model_max_length": 64,
  "special_tokens": [
    "<s>",
    "<pad>",
    "</s>",
    "<unk>",
    "<mask>",
    "<true>",
    "<false>"
  ],
  "tokenizer_class": "PreTrainedTokenizerFast"
}
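
A minimal sketch of loading a tokenizer governed by this config, assuming the file lives in the Hub repo ancatmara/historical-irish-tokenizer-bpe (inferred from the page path) and that the repo also ships the tokenizer.json that a fast tokenizer requires; the input sentence is an arbitrary Irish example.

from transformers import AutoTokenizer

# tokenizer_class "PreTrainedTokenizerFast" tells AutoTokenizer to build a
# fast (Rust-backed) tokenizer from the repo's tokenizer.json file.
tokenizer = AutoTokenizer.from_pretrained("ancatmara/historical-irish-tokenizer-bpe")

# model_max_length is 64, so truncate longer inputs when encoding.
enc = tokenizer(
    "Messe ocus Pangur bán",  # arbitrary example sentence
    truncation=True,
    max_length=tokenizer.model_max_length,
)
print(enc["input_ids"])
print(tokenizer.convert_ids_to_tokens(enc["input_ids"]))

Note that alongside the usual RoBERTa-style special tokens (<s>, <pad>, </s>, <unk>, <mask>), the config registers two non-standard entries, <true> and <false>.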