---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- Locutusque/StockQwen-2.5-7B
- allknowingroger/QwenSlerp8-7B
---
# ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B
ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B is a SLERP merge of the following models, created with [mergekit](https://github.com/cg123/mergekit):
* [Locutusque/StockQwen-2.5-7B](https://huggingface.co/Locutusque/StockQwen-2.5-7B)
* [allknowingroger/QwenSlerp8-7B](https://huggingface.co/allknowingroger/QwenSlerp8-7B)
## 🧩 Configuration
```yaml
slices:
  - sources:
      - model: Locutusque/StockQwen-2.5-7B
        layer_range: [0, 28]
      - model: allknowingroger/QwenSlerp8-7B
        layer_range: [0, 28]
merge_method: slerp
base_model: Locutusque/StockQwen-2.5-7B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
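## 💻 Usage
The merged checkpoint can be loaded like any other Qwen-2.5 causal LM. Below is a minimal inference sketch using 🤗 Transformers; it assumes `torch`, `transformers`, and `accelerate` are installed, and the prompt and generation settings are illustrative only.
```python
# pip install -qU transformers accelerate

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B"

# Load in bfloat16, matching the dtype used for the merge
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Qwen-2.5 models ship a chat template, so format the prompt with it
messages = [
    {"role": "user", "content": "Explain spherical linear interpolation in one sentence."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
# Strip the prompt tokens and decode only the generated continuation
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```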