# llama3.1_6.5b_mergkit_prunme / mergekit_config.yml
dtype: bfloat16
merge_method: passthrough
slices:
  - sources:
      - model: meta-llama/Meta-Llama-3-8B-Instruct
        layer_range: [0, 22]
  - sources:
      - model: meta-llama/Meta-Llama-3-8B-Instruct
        layer_range: [29, 32]
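A minimal sketch of the pruning arithmetic implied by the config above, assuming mergekit's `layer_range` is half-open (`[start, end)` keeps layers `start` through `end - 1`) and that Meta-Llama-3-8B-Instruct has 32 decoder layers:

```python
# Slice ranges copied from the passthrough config above.
# Assumption: layer_range is half-open, so [0, 22] keeps layers 0-21.
slices = [(0, 22), (29, 32)]

kept = sum(end - start for start, end in slices)
total = 32  # decoder layers in Meta-Llama-3-8B-Instruct (assumption)
pruned = total - kept

print(f"kept {kept} layers, pruned {pruned} of {total}")
```

Dropping the middle layers (22 through 28) shrinks the model while keeping the early and final layers intact, which is consistent with the "6.5b" in the repository name. A config like this would typically be run with mergekit's `mergekit-yaml` CLI, pointing it at the YAML file and an output directory.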