prokbert-mini-long / tokenizer_config.json
Ligeti Balázs
ProkBERT mini k6s2 upload
fe7f825
{
  "clean_up_tokenization_spaces": true,
  "cls_token": "[CLS]",
  "model_max_length": 1000000000000000019884624838656,
  "tokenizer_class": "ProkBERTTokenizer"
}
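
The very large `model_max_length` value is the standard Hugging Face sentinel for "no practical length limit": it is exactly `int(1e30)`, i.e. the float `1e30` converted to an integer, which explains the odd-looking trailing digits. A minimal sketch that parses the config above and checks this (the JSON is embedded verbatim for illustration):

```python
import json

# tokenizer_config.json contents, embedded verbatim for illustration.
config_text = """
{
  "clean_up_tokenization_spaces": true,
  "cls_token": "[CLS]",
  "model_max_length": 1000000000000000019884624838656,
  "tokenizer_class": "ProkBERTTokenizer"
}
"""

config = json.loads(config_text)

# Hugging Face writes int(1e30) when a tokenizer has no real maximum
# sequence length; the long integer is the exact value of float 1e30.
assert config["model_max_length"] == int(1e30)

print(config["tokenizer_class"])  # ProkBERTTokenizer
```

Because `tokenizer_class` names a custom class (`ProkBERTTokenizer`) rather than a built-in `transformers` tokenizer, loading this repository with `AutoTokenizer.from_pretrained` would typically require `trust_remote_code=True` so the custom tokenizer code can be fetched and executed.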