---
base_model:
- ifable/gemma-2-Ifable-9B
- unsloth/gemma-2-9b-it
- wzhouad/gemma-2-9b-it-WPO-HB
- nbeerbower/Gemma2-Gutenberg-Doppel-9B
library_name: transformers
tags:
- mergekit
- merge
---
# Gemma-2-Ataraxy-v3j-9B
Another experimental merge in this series; we're nearing the end of these tests, though.
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [DELLA](https://arxiv.org/abs/2406.11617) merge method, with [unsloth/gemma-2-9b-it](https://huggingface.co/unsloth/gemma-2-9b-it) as the base.
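For intuition, here is a toy sketch of the core DELLA idea (magnitude-based stochastic pruning of task vectors), not mergekit's actual implementation: each entry of a model's delta from the base is kept with a probability tied to its magnitude rank, spread over a window of width 2×`epsilon` around the target `density`, and survivors are rescaled so the delta is unchanged in expectation. All names and shapes here are illustrative.

```python
import torch

def della_prune(delta: torch.Tensor, density: float = 0.55, epsilon: float = 0.1) -> torch.Tensor:
    """Toy DELLA-style pruning of a task vector (fine-tuned minus base weights)."""
    flat = delta.flatten()
    n = flat.numel()
    # Rank entries by magnitude: 0 = smallest, n-1 = largest.
    ranks = flat.abs().argsort().argsort().float()
    # Keep probabilities run from density - epsilon (smallest magnitudes)
    # up to density + epsilon (largest magnitudes).
    keep_p = (density - epsilon) + 2.0 * epsilon * ranks / max(n - 1, 1)
    mask = torch.bernoulli(keep_p)
    # Rescale survivors by 1/p so the pruned delta matches the original in expectation.
    return (flat * mask / keep_p).reshape(delta.shape)
```

In the config below, `epsilon: 0.1` sets that probability window and `lambda: 1.0` leaves the merged deltas unscaled before they are added back onto the base weights.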
### Models Merged
The following models were included in the merge:
* [ifable/gemma-2-Ifable-9B](https://huggingface.co/ifable/gemma-2-Ifable-9B)
* [wzhouad/gemma-2-9b-it-WPO-HB](https://huggingface.co/wzhouad/gemma-2-9b-it-WPO-HB)
* [nbeerbower/Gemma2-Gutenberg-Doppel-9B](https://huggingface.co/nbeerbower/Gemma2-Gutenberg-Doppel-9B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: unsloth/gemma-2-9b-it
dtype: bfloat16
merge_method: della
parameters:
  epsilon: 0.1
  int8_mask: 1.0
  lambda: 1.0
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 42]
    model: unsloth/gemma-2-9b-it
  - layer_range: [0, 42]
    model: nbeerbower/Gemma2-Gutenberg-Doppel-9B
    parameters:
      density: 0.55
      weight: 0.6
  - layer_range: [0, 42]
    model: wzhouad/gemma-2-9b-it-WPO-HB
    parameters:
      density: 0.35
      weight: 0.6
  - layer_range: [0, 42]
    model: ifable/gemma-2-Ifable-9B
    parameters:
      density: 0.25
      weight: 0.4
```
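To reproduce the merge locally, here is a minimal sketch using mergekit's Python API (assuming the YAML above is saved as `config.yaml` and mergekit is installed via `pip install mergekit`; the output path is illustrative):

```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the merge recipe shown above (assumed saved as config.yaml).
with open("config.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Run the merge; the merged weights land in the output directory.
run_merge(
    merge_config,
    "./Gemma-2-Ataraxy-v3j-9B",
    options=MergeOptions(
        cuda=False,           # set True to merge on a GPU
        copy_tokenizer=True,  # carry over the base model's tokenizer
    ),
)
```

Equivalently, the `mergekit-yaml config.yaml ./Gemma-2-Ataraxy-v3j-9B` CLI runs the same recipe in one command.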