Lelantos-Maid-DPO-7B / mergekit_config.yml
slices:
  - sources:
      - model: SanjiWatsuki/Lelantos-DPO-7B
        layer_range: [0, 32]
      - model: NeverSleep/Noromaid-7B-0.4-DPO
        layer_range: [0, 32]
merge_method: slerp
base_model: NeverSleep/Noromaid-7B-0.4-DPO
parameters:
  t:
    - value: 0.5 # fallback for rest of tensors
dtype: bfloat16
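The config merges all 32 layers of the two 7B models with spherical linear interpolation (slerp) at t = 0.5, i.e. the midpoint along the arc between the two weight tensors rather than a straight average. A minimal sketch of the slerp operation itself (not mergekit's actual implementation; the function name and the small 2-D vectors are illustrative only):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    With t = 0.5, as in the config above, the result lies halfway along the
    arc between the two tensors on the unit sphere, scaled back implicitly
    by the sin-weighted combination of the originals.
    """
    # Normalize copies to measure the angle between the two directions.
    v0n = v0 / (np.linalg.norm(v0) + eps)
    v1n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the tensors
    if omega < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    so = np.sin(omega)
    # Sin-weighted blend of the original (unnormalized) tensors.
    return (np.sin((1 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Midpoint between two orthogonal unit vectors stays on the unit circle.
mid = slerp(0.5, np.array([1.0, 0.0]), np.array([0.0, 1.0]))
```

In practice the merge is produced by feeding this YAML to mergekit (e.g. via its `mergekit-yaml` entry point) rather than by hand-rolling the interpolation.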