---
base_model:
- arcee-ai/Llama-3.1-SuperNova-Lite
- Azazelle/Nimue-8B
- Replete-AI/L3-Pneuma-8B
- DreadPoor/Everything-COT-8B-r128-LoRA
- Nekochu/Luminia-8B-RP
- ResplendentAI/Theory_of_Mind_Llama3
- NousResearch/Hermes-3-Llama-3.1-8B
- kloodia/lora-8b-math
- refuelai/Llama-3-Refueled
- DreadPoor/OpenBioLLM-8B-r64-LoRA
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [arcee-ai/Llama-3.1-SuperNova-Lite](https://huggingface.co/arcee-ai/Llama-3.1-SuperNova-Lite) + [Azazelle/Nimue-8B](https://huggingface.co/Azazelle/Nimue-8B) as the base. Each `model + LoRA` pair below denotes a model with a LoRA adapter applied before merging.

### Models Merged

The following models were included in the merge:

* [Replete-AI/L3-Pneuma-8B](https://huggingface.co/Replete-AI/L3-Pneuma-8B) + [DreadPoor/Everything-COT-8B-r128-LoRA](https://huggingface.co/DreadPoor/Everything-COT-8B-r128-LoRA)
* [Nekochu/Luminia-8B-RP](https://huggingface.co/Nekochu/Luminia-8B-RP) + [ResplendentAI/Theory_of_Mind_Llama3](https://huggingface.co/ResplendentAI/Theory_of_Mind_Llama3)
* [NousResearch/Hermes-3-Llama-3.1-8B](https://huggingface.co/NousResearch/Hermes-3-Llama-3.1-8B) + [kloodia/lora-8b-math](https://huggingface.co/kloodia/lora-8b-math)
* [refuelai/Llama-3-Refueled](https://huggingface.co/refuelai/Llama-3-Refueled) + [DreadPoor/OpenBioLLM-8B-r64-LoRA](https://huggingface.co/DreadPoor/OpenBioLLM-8B-r64-LoRA)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: refuelai/Llama-3-Refueled+DreadPoor/OpenBioLLM-8B-r64-LoRA
  - model: Replete-AI/L3-Pneuma-8B+DreadPoor/Everything-COT-8B-r128-LoRA
  - model: Nekochu/Luminia-8B-RP+ResplendentAI/Theory_of_Mind_Llama3
  - model: NousResearch/Hermes-3-Llama-3.1-8B+kloodia/lora-8b-math
merge_method: model_stock
base_model: arcee-ai/Llama-3.1-SuperNova-Lite+Azazelle/Nimue-8B
normalize: false
int8_mask: true
dtype: bfloat16
```
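A merge like this is typically reproduced by saving the YAML above to a file and running mergekit over it. Below is a minimal sketch using mergekit's Python API, assuming the installed mergekit version exposes `MergeConfiguration` and `run_merge` as in the project README; the file name `config.yaml` and output path `./merged` are placeholders. The CLI equivalent would be `mergekit-yaml config.yaml ./merged`.

```python
# Sketch: reproduce the merge via mergekit's Python API.
# Assumes `pip install mergekit` and the API surface shown in the
# mergekit README; paths here are illustrative placeholders.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YAML = "config.yaml"   # the YAML configuration above, saved to disk
OUTPUT_PATH = "./merged"      # directory where merged weights are written

with open(CONFIG_YAML, "r", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    config,
    OUTPUT_PATH,
    options=MergeOptions(
        cuda=False,            # set True to run the merge on GPU
        copy_tokenizer=True,   # copy the base model's tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```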
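Once merged, the result is a standard Llama-3.1-architecture checkpoint and loads with `transformers`. A minimal sketch, assuming the merge output was written to the `./merged` directory used above (the prompt is just an illustration):

```python
# Sketch: load and query the merged model with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./merged")
model = AutoModelForCausalLM.from_pretrained(
    "./merged",
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",
)

prompt = "Briefly explain chain-of-thought prompting."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```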