vicuna2 / tokenizer_config.json
{
  "bos_token": "",
  "eos_token": "",
  "model_max_length": 1024,
  "padding_side": "right",
  "special_tokens_map_file": "/home/ubuntu/.cache/huggingface/hub/models--decapoda-research--llama-13b-hf/snapshots/438770a656712a5072229b62256521845d4de5ce/special_tokens_map.json",
  "tokenizer_class": "LLaMATokenizer",
  "unk_token": ""
}
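
For context, here is a minimal sketch of loading a tokenizer that ships with this config via Hugging Face transformers. Two known quirks of decapoda-research-derived LLaMA configs appear in this file: the special tokens are empty strings, and the class name is spelled "LLaMATokenizer", which newer transformers releases (where the class was renamed to LlamaTokenizer) do not recognize. The repo id chavinlo/vicuna2 and the standard LLaMA special-token overrides below are assumptions on my part, not part of this file.

from transformers import LlamaTokenizer

# Load with LlamaTokenizer directly to sidestep the "LLaMATokenizer"
# class-name mismatch that AutoTokenizer hits on modern transformers.
tokenizer = LlamaTokenizer.from_pretrained(
    "chavinlo/vicuna2",  # assumed repo id for this file
    bos_token="<s>",     # the empty special-token strings in this config
    eos_token="</s>",    # are a known quirk of decapoda-research-derived
    unk_token="<unk>",   # configs; these are the standard LLaMA values
)

print(tokenizer.model_max_length)  # 1024, as set in this config

Without the overrides, the special tokens come back as empty strings, which breaks end-of-sequence handling during generation; many LLaMA fine-tunes of this era patched the tokens the same way.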