---
base_model:
- MrRobotoAI/Thor-v1.4-8b-DARK-FICTION
- MrRobotoAI/8b-unaligned-BASE-v2y
- MrRobotoAI/8b-unaligned-BASE-v2m
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [MrRobotoAI/Thor-v1.4-8b-DARK-FICTION](https://huggingface.co/MrRobotoAI/Thor-v1.4-8b-DARK-FICTION) as the base.

### Models Merged

The following models were included in the merge:

* [MrRobotoAI/8b-unaligned-BASE-v2y](https://huggingface.co/MrRobotoAI/8b-unaligned-BASE-v2y)
* [MrRobotoAI/8b-unaligned-BASE-v2m](https://huggingface.co/MrRobotoAI/8b-unaligned-BASE-v2m)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: ties
models:
  - model: MrRobotoAI/8b-unaligned-BASE-v2m
    parameters:
      weight:
        - filter: v_proj
          value: [1, 1, 1, 0.995, 0.995, 0.995, 0.995, 0.995, 1, 1, 1]
        - filter: o_proj
          value: [1, 1, 1, 0.995, 0.995, 0.995, 0.995, 0.995, 1, 1, 1]
        - filter: up_proj
          value: [1, 1, 1, 0.995, 0.995, 0.995, 0.995, 0.995, 1, 1, 1]
        - filter: gate_proj
          value: [1, 1, 1, 0.995, 0.995, 0.995, 0.995, 0.995, 1, 1, 1]
        - filter: down_proj
          value: [1, 1, 1, 0.995, 0.995, 0.995, 0.995, 0.995, 1, 1, 1]
        - value: 1
  - model: MrRobotoAI/8b-unaligned-BASE-v2y
    parameters:
      weight:
        - filter: v_proj
          value: [0, 0, 0, 0.005, 0.005, 0.005, 0.005, 0.005, 0, 0, 0]
        - filter: o_proj
          value: [0, 0, 0, 0.005, 0.005, 0.005, 0.005, 0.005, 0, 0, 0]
        - filter: up_proj
          value: [0, 0, 0, 0.005, 0.005, 0.005, 0.005, 0.005, 0, 0, 0]
        - filter: gate_proj
          value: [0, 0, 0, 0.005, 0.005, 0.005, 0.005, 0.005, 0, 0, 0]
        - filter: down_proj
          value: [0, 0, 0, 0.005, 0.005, 0.005, 0.005, 0.005, 0, 0, 0]
        - value: 0
base_model: MrRobotoAI/Thor-v1.4-8b-DARK-FICTION
tokenizer_source: base
dtype: bfloat16
```
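
The merged checkpoint loads like any other `transformers` causal language model. A minimal loading sketch, assuming the merge output has been saved locally; `path/to/merged-model` is a placeholder, not a published repository id:

```python
# Minimal loading sketch; the model path below is a placeholder (assumption),
# not a published repository id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "path/to/merged-model"  # placeholder: local merge output directory

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype specified in the merge config
    device_map="auto",           # requires accelerate; remove for a single explicit device
)

prompt = "Once upon a midnight dreary,"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```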