---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- bluuwhale/L3-SAO-MIX-8B-V1
- ProdeusUnity/Astral-Fusion-8b-v0.0
- cgato/L3-TheSpice-8b-v0.8.3
- Locutusque/Llama-3-Yggdrasil-2.0-8B
---

# ZeroXClem/Llama-3-Yggdrasil-AstralSpice-8B

ZeroXClem/Llama-3-Yggdrasil-AstralSpice-8B is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [bluuwhale/L3-SAO-MIX-8B-V1](https://huggingface.co/bluuwhale/L3-SAO-MIX-8B-V1)
* [ProdeusUnity/Astral-Fusion-8b-v0.0](https://huggingface.co/ProdeusUnity/Astral-Fusion-8b-v0.0)
* [cgato/L3-TheSpice-8b-v0.8.3](https://huggingface.co/cgato/L3-TheSpice-8b-v0.8.3)
* [Locutusque/Llama-3-Yggdrasil-2.0-8B](https://huggingface.co/Locutusque/Llama-3-Yggdrasil-2.0-8B)

## 🧩 Configuration

```yaml
models:
  - model: bluuwhale/L3-SAO-MIX-8B-V1
    parameters:
      density: 0.4
      weight: 0.3
  - model: ProdeusUnity/Astral-Fusion-8b-v0.0
    parameters:
      density: 0.5
      weight: 0.3
  - model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      density: 0.5
      weight: 0.2
  - model: Locutusque/Llama-3-Yggdrasil-2.0-8B
    parameters:
      density: 0.5
      weight: 0.2
merge_method: ties
base_model: Locutusque/Llama-3-Yggdrasil-2.0-8B
dtype: bfloat16
parameters:
  normalize: true
out_dtype: float16
```
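
## 💻 Usage

The merged model loads like any other Llama-3 checkpoint on the Hub. The sketch below uses the standard 🤗 Transformers `text-generation` pipeline; it assumes the repo name above, that `transformers`, `accelerate`, and `torch` are installed, and that the tokenizer ships a Llama-3 chat template. Adjust sampling parameters to taste.

```python
import torch
import transformers
from transformers import AutoTokenizer

# Assumed Hub repo name, taken from this card.
model = "ZeroXClem/Llama-3-Yggdrasil-AstralSpice-8B"

# Build a chat-formatted prompt from the tokenizer's chat template.
tokenizer = AutoTokenizer.from_pretrained(model)
messages = [{"role": "user", "content": "What is a large language model?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

# Load the merged weights and generate; device_map="auto" requires accelerate.
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)
outputs = pipeline(
    prompt,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_k=50,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```

Note that generation downloads the full 8B-parameter weights from the Hub, so a GPU with roughly 16 GB of memory (at float16) is a practical minimum.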