BigCodeLlama-92b / mergekit_config.yml
nisten · Upload folder using huggingface_hub · commit 1c120ff
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 69]
    model:
      model:
        path: ../CodeLlama-70b-Instruct-hf
- sources:
  - layer_range: [42, 80]
    model:
      model:
        path: ../CodeLlama-70b-Python-hf
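The config above is a passthrough "frankenmerge": it stacks the first slice of CodeLlama-70b-Instruct on top of a later slice of CodeLlama-70b-Python, with an overlapping band of duplicated layers. A minimal sketch of the resulting depth, assuming mergekit's half-open `layer_range` convention where `[start, end)` covers layers `start` through `end - 1`:

```python
# Hypothetical sketch of the layer arithmetic implied by the slices above.
# Assumes layer_range is half-open: [start, end) -> layers start .. end-1.

slices = [
    ("../CodeLlama-70b-Instruct-hf", 0, 69),   # layers 0-68 (69 layers)
    ("../CodeLlama-70b-Python-hf", 42, 80),    # layers 42-79 (38 layers)
]

total_layers = sum(end - start for _, start, end in slices)
print(total_layers)  # 69 + 38 = 107 layers in the merged model
```

Since the 70B base models have 80 transformer layers each, growing to 107 layers is roughly what pushes the parameter count into the ~92B range the repository name advertises; layers 42-68 appear twice, once from each donor model.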