Llama-3-dare_ties / mergekit_config.yml
models:
  - model: meta-llama/Meta-Llama-3-8B
  - model: meta-llama/Meta-Llama-3-8B-Instruct
    parameters:
      density: 0.53
      weight: 0.6
merge_method: dare_ties
base_model: meta-llama/Meta-Llama-3-8B
parameters:
  int8_mask: true
dtype: bfloat16
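
For reference, a config like this is normally applied with mergekit. The snippet below is a minimal sketch using mergekit's Python entry points (MergeConfiguration, MergeOptions, run_merge) as they appear in the project's README; the output directory name is a placeholder, not part of this repository.

# Sketch: run the DARE-TIES merge described by mergekit_config.yml.
# Assumes mergekit is installed and its README API (run_merge / MergeOptions) is current.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load and validate the YAML config shown above.
with open("mergekit_config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Llama-3-dare_ties",      # placeholder output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is available
        copy_tokenizer=True,             # copy the tokenizer from the base model
    ),
)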