---
base_model:
- Undi95/Toppy-M-7B
- SanjiWatsuki/Kunoichi-DPO-v2-7B
- Epiculous/Fett-uccine-7B
- NeverSleep/Noromaid-7B-0.4-DPO
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B) as the base model.

### Models Merged

The following models were included in the merge:

* [Undi95/Toppy-M-7B](https://huggingface.co/Undi95/Toppy-M-7B)
* [Epiculous/Fett-uccine-7B](https://huggingface.co/Epiculous/Fett-uccine-7B)
* [NeverSleep/Noromaid-7B-0.4-DPO](https://huggingface.co/NeverSleep/Noromaid-7B-0.4-DPO)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: NeverSleep/Noromaid-7B-0.4-DPO
  - model: SanjiWatsuki/Kunoichi-DPO-v2-7B
  - model: Undi95/Toppy-M-7B
  - model: Epiculous/Fett-uccine-7B
merge_method: model_stock
base_model: SanjiWatsuki/Kunoichi-DPO-v2-7B
dtype: bfloat16
```
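
To reproduce a merge like this one, the configuration can be run with mergekit's `mergekit-yaml` command. This is a sketch, not the exact command the author ran; the config filename and output directory below are placeholders:

```shell
# Install mergekit (pin a specific version if you need reproducibility)
pip install mergekit

# Save the YAML configuration above as merge-config.yml, then run the merge.
# --cuda uses a GPU if one is available; omit it to merge on CPU.
mergekit-yaml merge-config.yml ./merged-model --cuda
```

Note that merging four 7B models requires downloading all of their weights, so expect substantial disk usage and transfer time on the first run.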