xLAM-1b-fc-r-ct2-int8 / config.json
{
  "bos_token": "<\uff5cbegin\u2581of\u2581sentence\uff5c>",
  "eos_token": "<|EOT|>",
  "layer_norm_epsilon": 1e-06,
  "multi_query_attention": true,
  "unk_token": ""
}
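As a minimal sketch of how this fragment parses, the snippet below loads the JSON shown above (inlined verbatim as a string) and demonstrates that the `\uXXXX` escapes in `bos_token` decode to real Unicode characters (fullwidth vertical bars and "lower one eighth block" word separators, as used by DeepSeek-style tokenizers). The variable names here are illustrative, not part of any library API.

```python
import json

# The config.json contents shown above, inlined for illustration.
# (Backslashes are doubled so the JSON escapes survive the Python literal.)
config_text = """{
  "bos_token": "<\\uff5cbegin\\u2581of\\u2581sentence\\uff5c>",
  "eos_token": "<|EOT|>",
  "layer_norm_epsilon": 1e-06,
  "multi_query_attention": true,
  "unk_token": ""
}"""

config = json.loads(config_text)

# JSON \uXXXX escapes decode to actual Unicode characters on load.
assert config["bos_token"] == "<\uff5cbegin\u2581of\u2581sentence\uff5c>"
print(config["bos_token"])            # <｜begin▁of▁sentence｜>
print(config["eos_token"])            # <|EOT|>
print(config["multi_query_attention"])  # True
```

Note that `eos_token` uses a plain ASCII `|` while `bos_token` uses the fullwidth `｜` (U+FF5C); preserving that distinction matters, since the tokenizer matches these strings exactly.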