gastronomia_para_to2 / special_tokens_map.json
Commit: add tokenizer (dd19c4d)
350 Bytes
{
  "bos_token": "<|endoftext|>",
  "eos_token": "<|endoftext|>",
  "unk_token": "<|endoftext|>",
  "pad_token": "[PAD]",
  "additional_special_tokens": [
    "<INPUT_START>", "<NEXT_INPUT>", "<INPUT_END>",
    "<TITLE_START>", "<TITLE_END>",
    "<INGR_START>", "<NEXT_INGR>", "<INGR_END>",
    "<INSTR_START>", "<NEXT_INSTR>", "<INSTR_END>",
    "<RECIPE_START>", "<RECIPE_END>"
  ]
}
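The map above follows the GPT-2 convention of reusing a single `<|endoftext|>` token for BOS, EOS, and UNK, and adds structural markers that delimit recipe fields (title, ingredients, instructions). A minimal stdlib-only sketch of reading and sanity-checking this file's contents (the `RAW_MAP` string is just the JSON copied from above, not a separate download):

```python
import json

# Contents of special_tokens_map.json, copied verbatim from the file above.
RAW_MAP = (
    '{"bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", '
    '"unk_token": "<|endoftext|>", "pad_token": "[PAD]", '
    '"additional_special_tokens": ["<INPUT_START>", "<NEXT_INPUT>", '
    '"<INPUT_END>", "<TITLE_START>", "<TITLE_END>", "<INGR_START>", '
    '"<NEXT_INGR>", "<INGR_END>", "<INSTR_START>", "<NEXT_INSTR>", '
    '"<INSTR_END>", "<RECIPE_START>", "<RECIPE_END>"]}'
)

special_tokens = json.loads(RAW_MAP)

# GPT-2-style setup: one token plays the role of BOS, EOS, and UNK.
assert (special_tokens["bos_token"]
        == special_tokens["eos_token"]
        == special_tokens["unk_token"]
        == "<|endoftext|>")

# 13 structural markers segment each recipe into its fields.
print(len(special_tokens["additional_special_tokens"]))  # -> 13
```

When loading the repository with the `transformers` library, `AutoTokenizer.from_pretrained(...)` reads `special_tokens_map.json` automatically, so these tokens are registered without any manual step.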