---
base_model:
- ifable/gemma-2-Ifable-9B
- unsloth/gemma-2-9b-it
- IlyaGusev/gemma-2-9b-it-abliterated
library_name: transformers
tags:
- mergekit
- merge
---

# output

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the DELLA merge method, with [unsloth/gemma-2-9b-it](https://huggingface.co/unsloth/gemma-2-9b-it) as the base model.

### Models Merged

The following models were included in the merge:
* [ifable/gemma-2-Ifable-9B](https://huggingface.co/ifable/gemma-2-Ifable-9B)
* [IlyaGusev/gemma-2-9b-it-abliterated](https://huggingface.co/IlyaGusev/gemma-2-9b-it-abliterated)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: ifable/gemma-2-Ifable-9B
    parameters:
      density: 0.4
      weight: 0.5
      epsilon: 0.1
      lambda: 1.2
  - model: IlyaGusev/gemma-2-9b-it-abliterated
    parameters:
      density: 0.6
      weight: 0.5
      epsilon: 0.05
      lambda: 1.0
merge_method: della
base_model: unsloth/gemma-2-9b-it
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```
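For intuition about what `density`, `weight`, and `lambda` control, here is a simplified drop-and-rescale sketch of the merge on toy tensors. This is a toy approximation only, not mergekit's actual DELLA implementation: real DELLA uses magnitude-adaptive drop probabilities (with `epsilon` setting the spread around `density`) plus sign election, whereas this sketch drops parameters uniformly at random.

```python
import numpy as np

def della_merge_sketch(base, finetuned_list, densities, weights, lambdas, seed=0):
    """Toy drop-and-rescale merge (NOT mergekit's exact DELLA algorithm).

    For each fine-tuned model: form the task vector (delta from the base),
    keep roughly `density` of its entries at random, rescale survivors by
    1/density so the expected value is preserved, scale by lambda, and
    combine the deltas as a weighted sum added back onto the base weights.
    """
    rng = np.random.default_rng(seed)
    merged_delta = np.zeros_like(base)
    for ft, d, w, lam in zip(finetuned_list, densities, weights, lambdas):
        delta = ft - base                        # task vector
        mask = rng.random(base.shape) < d        # keep ~density fraction
        delta = np.where(mask, delta / d, 0.0)   # rescale kept entries
        merged_delta += w * lam * delta          # weighted, lambda-scaled sum
    return base + merged_delta
```

With `density: 1.0` the sketch keeps every entry and reduces to a plain weighted sum of task vectors; lowering `density` sparsifies each delta before merging, which is what reduces interference between the two donor models.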