translation_llama_7b / original_params.json
{
"dim": 4096,
"n_layers": 32,
"n_heads": 32,
"n_kv_heads": 8,
"vocab_size": 128256,
"multiple_of": 1024,
"ffn_dim_multiplier": 1.3,
"norm_eps": 1e-05,
"rope_theta": 500000.0
}
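
For reference, a minimal Python sketch of how these fields are typically consumed downstream. The derived quantities follow the sizing rule in Meta's reference Llama implementation (head_dim from dim / n_heads; SwiGLU FFN hidden size from ffn_dim_multiplier and multiple_of); the file path in the example is an assumption based on this repo's layout.

```python
import json

# Load the raw parameter file (path assumed from this repo's layout).
with open("original_params.json") as f:
    params = json.load(f)

dim = params["dim"]                # model width: 4096
n_heads = params["n_heads"]        # 32 query heads
n_kv_heads = params["n_kv_heads"]  # 8 key/value heads (grouped-query attention)

# Per-head dimension shared by the Q, K, and V projections.
head_dim = dim // n_heads          # 4096 // 32 = 128

# FFN hidden size, following the sizing rule in Meta's reference Llama
# code: start from 4*dim, take 2/3 of it (SwiGLU uses three weight
# matrices instead of two), scale by ffn_dim_multiplier, then round up
# to a multiple of `multiple_of`.
hidden_dim = 4 * dim
hidden_dim = int(2 * hidden_dim / 3)
hidden_dim = int(params["ffn_dim_multiplier"] * hidden_dim)
multiple_of = params["multiple_of"]
hidden_dim = multiple_of * ((hidden_dim + multiple_of - 1) // multiple_of)

print(f"head_dim={head_dim}, "
      f"kv_groups={n_heads // n_kv_heads}, "
      f"ffn_hidden={hidden_dim}")
# head_dim=128, kv_groups=4, ffn_hidden=14336
```

With these values, n_kv_heads < n_heads means each key/value head is shared across 4 query heads (grouped-query attention), and the FFN hidden size works out to 14336.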