t5-large-QuestionGeneration / tokenizer_config.json
{
  "name_or_path": "t5-large",
  "special_tokens_map_file": null,
  "tokenizer_class": "PreTrainedTokenizerFast"
}
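
A minimal sketch of how this config is consumed, assuming the repo id "Sehong/t5-large-QuestionGeneration" (inferred from the page path, not confirmed here):

# Hypothetical usage sketch: load the tokenizer from the Hub.
# from_pretrained reads tokenizer_config.json to resolve the tokenizer
# class ("PreTrainedTokenizerFast") declared above.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Sehong/t5-large-QuestionGeneration")

print(type(tokenizer).__name__)   # class resolved from tokenizer_config.json
print(tokenizer.name_or_path)     # identifier the tokenizer was loaded from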