---
tags:
- merge
- mergekit
---

# NeuraLake-m7-v2-7B⚡

NeuraLake-m7-v2-7B is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):

* [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B)
* [chargoddard/loyal-piano-m7](https://huggingface.co/chargoddard/loyal-piano-m7)
* [macadeliccc/WestLake-7B-v2-laser-truthy-dpo](https://huggingface.co/macadeliccc/WestLake-7B-v2-laser-truthy-dpo)
* [athirdpath/NSFW_DPO_vmgb-7b](https://huggingface.co/athirdpath/NSFW_DPO_vmgb-7b)

## 🛠️ Configuration

```yaml
models:
  - model: mistralai/Mistral-7B-v0.1
    # No parameters necessary for base model
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      weight: 0.3
      density: 0.8
  - model: chargoddard/loyal-piano-m7
    parameters:
      weight: 0.4
      density: 0.8
  - model: macadeliccc/WestLake-7B-v2-laser-truthy-dpo
    parameters:
      weight: 0.3
      density: 0.4
  - model: athirdpath/NSFW_DPO_vmgb-7b
    parameters:
      weight: 0.2
      density: 0.4
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
  # normalize: true
dtype: bfloat16
```
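
## 💻 Usage

A minimal inference sketch using 🤗 Transformers. The repo id below is an assumption rather than a confirmed upload path; substitute the location where the merged weights are actually published.

```python
# Minimal usage sketch for the merged model with 🤗 Transformers.
# "NeuraLake/NeuraLake-m7-v2-7B" is a hypothetical repo id; replace it
# with the actual path to the published merge.
import torch
from transformers import pipeline

model_id = "NeuraLake/NeuraLake-m7-v2-7B"  # assumed repo id

generator = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype the merge was written in
    device_map="auto",
)

prompt = "Explain what a model merge is in one paragraph."
outputs = generator(
    prompt,
    max_new_tokens=200,
    do_sample=True,
    temperature=0.7,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```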