Hermes-2-Pro-11B / mergekit_config.yml
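# Passthrough (layer-stacking) merge: ten overlapping 5-layer windows of
# NousResearch/Hermes-2-Pro-Mistral-7B, stepping by 3 layers (0-5, 3-8, ..., 27-32),
# are concatenated into a 50-layer model of roughly 11B parameters.
# No weights are averaged; each slice is copied through unchanged.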
slices:
- sources:
  - layer_range: [0, 5]
    model: NousResearch/Hermes-2-Pro-Mistral-7B
- sources:
  - layer_range: [3, 8]
    model: NousResearch/Hermes-2-Pro-Mistral-7B
- sources:
  - layer_range: [6, 11]
    model: NousResearch/Hermes-2-Pro-Mistral-7B
- sources:
  - layer_range: [9, 14]
    model: NousResearch/Hermes-2-Pro-Mistral-7B
- sources:
  - layer_range: [12, 17]
    model: NousResearch/Hermes-2-Pro-Mistral-7B
- sources:
  - layer_range: [15, 20]
    model: NousResearch/Hermes-2-Pro-Mistral-7B
- sources:
  - layer_range: [18, 23]
    model: NousResearch/Hermes-2-Pro-Mistral-7B
- sources:
  - layer_range: [21, 26]
    model: NousResearch/Hermes-2-Pro-Mistral-7B
- sources:
  - layer_range: [24, 29]
    model: NousResearch/Hermes-2-Pro-Mistral-7B
- sources:
  - layer_range: [27, 32]
    model: NousResearch/Hermes-2-Pro-Mistral-7B
merge_method: passthrough
dtype: bfloat16
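# Build sketch (not part of the original config): with mergekit installed,
# a config like this is passed to the mergekit-yaml CLI; the output
# directory name below is illustrative.
#   mergekit-yaml mergekit_config.yml ./Hermes-2-Pro-11B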