MobileLLM-1B-MNN / llm_config.json
{
    "hidden_size": 1280,
    "layer_nums": 54,
    "attention_mask": "float",
    "key_value_shape": [
        2,
        1,
        0,
        5,
        64
    ],
    "prompt_template": "%s",
    "is_visual": false
}
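
For reference, a minimal sketch of how this config can be read, assuming the usual MNN llm_config conventions: key_value_shape appears to describe the per-layer KV-cache tensor as [K/V pair, batch, sequence length (0 = dynamic), KV heads, head dim], layer_nums is the transformer layer count, and prompt_template "%s" substitutes the user text verbatim. The fp16 storage size and the dimension interpretation below are assumptions for illustration, not something stated in this file.

import json

# Illustrative sketch (not part of the MNN runtime): load llm_config.json and
# estimate the KV-cache footprint per generated token.
with open("llm_config.json") as f:
    cfg = json.load(f)

# Assumed dimension order: K/V pair, batch, seq (dynamic), KV heads, head dim.
num_kv, batch, _seq, kv_heads, head_dim = cfg["key_value_shape"]
bytes_per_elem = 2  # assumption: fp16 cache

per_token = num_kv * batch * kv_heads * head_dim * bytes_per_elem * cfg["layer_nums"]
print(f"KV cache per token across {cfg['layer_nums']} layers: {per_token} bytes")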