---
base_model:
- ibm/merlinite-7b
- mistralai/Mistral-7B-v0.1
- l3utterfly/mistral-7b-v0.1-layla-v4
- SanjiWatsuki/Sonya-7B
- NeverSleep/Noromaid-7b-v0.2
- migtissera/SynthIA-7B-v1.3
library_name: transformers
tags:
- mergekit
- merge
---

![](mistressmaid.png)

# Franken-MistressMaid-7B

This is a merge of pre-trained language models created with [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged with the [TIES](https://arxiv.org/abs/2306.01708) method, using [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) as the base. TIES trims each fine-tuned model's parameter deltas down to the most significant fraction (controlled by `density`), resolves sign conflicts between models, and merges only the agreeing parameters.

### Models Merged

The following models were included in the merge:

* [ibm/merlinite-7b](https://huggingface.co/ibm/merlinite-7b)
* [l3utterfly/mistral-7b-v0.1-layla-v4](https://huggingface.co/l3utterfly/mistral-7b-v0.1-layla-v4)
* [SanjiWatsuki/Sonya-7B](https://huggingface.co/SanjiWatsuki/Sonya-7B)
* [NeverSleep/Noromaid-7b-v0.2](https://huggingface.co/NeverSleep/Noromaid-7b-v0.2)
* [migtissera/SynthIA-7B-v1.3](https://huggingface.co/migtissera/SynthIA-7B-v1.3)

### Configuration

The following YAML configuration was used to produce this model. SynthIA-7B-v1.3 contributes at full weight and density, the remaining models at lower weights; the global `density: 0.4` applies wherever no per-model value is set, and `normalize: true` rescales the weights to sum to 1.

```yaml
models:
  - model: migtissera/SynthIA-7B-v1.3
    parameters:
      weight: 1
      density: 1
  - model: ibm/merlinite-7b
    parameters:
      weight: 0.3
  - model: SanjiWatsuki/Sonya-7B
    parameters:
      weight: 0.2
  - model: NeverSleep/Noromaid-7b-v0.2
    parameters:
      weight: 0.2
  - model: l3utterfly/mistral-7b-v0.1-layla-v4
    parameters:
      weight: 0.2
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  density: 0.4
  int8_mask: true
  normalize: true
dtype: bfloat16
```
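
## Usage

The merge can be reproduced by saving the YAML above to a file and running mergekit's command-line entry point against it (e.g. `mergekit-yaml config.yml ./output-model-directory`). To run inference on the merged model, the snippet below is a minimal sketch using 🤗 Transformers; the repo id is an assumption based on this card's title, so point it at wherever the merged weights are actually hosted.

```python
# Minimal inference sketch for the merged model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: replace with the actual Hugging Face repo id of the merge.
repo_id = "Franken-MistressMaid-7B"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype
    device_map="auto",           # requires the accelerate package
)

prompt = "Write a short introduction to model merging."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Loading in `bfloat16` keeps memory use around 14 GB for a 7B model; on hardware without bfloat16 support, `torch.float16` is a reasonable substitute.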