fgcnpact2 / config.json
Mufcruz · Update config.json · 0ce81bd (verified)
{
  "model_type": "llama",
  "pipeline_tag": "feature-extraction",
  "hidden_size": 4096,
  "num_attention_heads": 32,
  "num_hidden_layers": 32,
  "vocab_size": 32000
}
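As a quick sanity check of the values above, the config can be parsed with the standard `json` module and used to derive the per-head dimension (hidden size divided by the number of attention heads). This is a minimal sketch with the JSON inlined for self-containment; in practice you would read the file from disk or load it via the model library.

```python
import json

# The Llama-style config above, inlined so the example is self-contained.
config_text = """
{
  "model_type": "llama",
  "pipeline_tag": "feature-extraction",
  "hidden_size": 4096,
  "num_attention_heads": 32,
  "num_hidden_layers": 32,
  "vocab_size": 32000
}
"""

config = json.loads(config_text)

# Each attention head attends over an equal slice of the hidden state,
# so the per-head dimension is hidden_size / num_attention_heads.
head_dim = config["hidden_size"] // config["num_attention_heads"]
print(config["model_type"], head_dim)  # llama 128
```

Note that `hidden_size` must be evenly divisible by `num_attention_heads`, which holds here (4096 / 32 = 128).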