---
base_model:
- migtissera/Tess-2.0-Llama-3-8B
- Pedro13543/Nice_mix_LoRa
- THUDM/LongCite-llama3.1-8b
- huihui-ai/deepthought-8b-abliterated
- Undi95/Meta-Llama-3.1-8B-Claude
- FuseAI/FuseChat-Llama-3.1-8B-Instruct
- vicgalle/Configurable-Llama-3-8B-v0.3
- Skywork/Skywork-o1-Open-Llama-3.1-8B
- DreadPoor/abliteration-OVA-8B-r128-LORA
library_name: transformers
tags:
- mergekit
- merge
---

# merge
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with migtissera/Tess-2.0-Llama-3-8B plus the Pedro13543/Nice_mix_LoRa LoRA as the base.
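In rough terms, Model Stock averages the fine-tuned weights and then interpolates that average back toward the base weights, with a per-tensor ratio derived from the average pairwise cosine similarity of the fine-tuned task vectors. The NumPy sketch below is an illustrative simplification of that idea (the ratio formula follows my reading of the Model Stock paper), not mergekit's actual implementation; the function name and tensors are placeholders.

```python
import numpy as np

def model_stock_merge(base: np.ndarray, finetuned: list[np.ndarray]) -> np.ndarray:
    """Simplified per-tensor Model Stock interpolation (illustrative only)."""
    deltas = [w - base for w in finetuned]      # task vectors relative to the base
    avg = base + np.mean(deltas, axis=0)        # plain average of the fine-tuned weights

    # Average pairwise cosine similarity between task vectors.
    cosines = []
    for i in range(len(deltas)):
        for j in range(i + 1, len(deltas)):
            num = float(np.dot(deltas[i].ravel(), deltas[j].ravel()))
            den = np.linalg.norm(deltas[i]) * np.linalg.norm(deltas[j]) + 1e-12
            cosines.append(num / den)
    cos_theta = float(np.mean(cosines))

    # Interpolation ratio (per the Model Stock paper): t = N*cos / (1 + (N-1)*cos).
    n = len(finetuned)
    t = n * cos_theta / (1.0 + (n - 1) * cos_theta)

    # Pull the averaged weights back toward the base by (1 - t).
    return t * avg + (1.0 - t) * base
```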
### Models Merged
The following models were included in the merge:
- migtissera/Tess-2.0-Llama-3-8B
- THUDM/LongCite-llama3.1-8b
- huihui-ai/deepthought-8b-abliterated
- Undi95/Meta-Llama-3.1-8B-Claude
- FuseAI/FuseChat-Llama-3.1-8B-Instruct
- vicgalle/Configurable-Llama-3-8B-v0.3
- Skywork/Skywork-o1-Open-Llama-3.1-8B + DreadPoor/abliteration-OVA-8B-r128-LORA
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Skywork/Skywork-o1-Open-Llama-3.1-8B+DreadPoor/abliteration-OVA-8B-r128-LORA
  - model: migtissera/Tess-2.0-Llama-3-8B
  - model: FuseAI/FuseChat-Llama-3.1-8B-Instruct
  - model: THUDM/LongCite-llama3.1-8b
  - model: vicgalle/Configurable-Llama-3-8B-v0.3
  - model: huihui-ai/deepthought-8b-abliterated
  - model: Undi95/Meta-Llama-3.1-8B-Claude
merge_method: model_stock
base_model: migtissera/Tess-2.0-Llama-3-8B+Pedro13543/Nice_mix_LoRa
dtype: bfloat16
```
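Once published, the merged weights load like any other Llama-3.1-style checkpoint via transformers. A minimal usage sketch follows; the repository id is a placeholder, so substitute the name this merge is actually released under (device_map="auto" assumes accelerate is installed).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; replace with the actual repository for this merge.
model_id = "your-username/your-merged-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

messages = [{"role": "user", "content": "Briefly explain what a model merge is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```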