---
license: apache-2.0
tags:
- mergekit
- merge
---
# Nyakura-CausalLM-RP-34B
This is a merge of pre-trained language models created using mergekit.
GGUFs by mradermacher if you want 'em:
https://huggingface.co/mradermacher/Nyakura-CausalLM-RP-34B-i1-GGUF
## Merge Details

Just uploading one of my stew's merge pieces here for convenience. If you want to use it, you can. ChatML should work well as the prompt format.
### Merge Method
This model was merged using the SLERP merge method.
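For intuition: SLERP interpolates each pair of weight tensors along the arc between them on the hypersphere, rather than along a straight line, which better preserves the magnitude of the merged weights. A minimal NumPy sketch of the idea (not mergekit's actual implementation):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Moves along the arc between v0 and v1 (flattened to vectors for the
    angle computation), falling back to plain lerp when near-parallel.
    """
    v0f, v1f = v0.ravel(), v1.ravel()
    # Cosine of the angle between the two tensors
    cos_theta = np.dot(v0f, v1f) / (np.linalg.norm(v0f) * np.linalg.norm(v1f) + eps)
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    if np.sin(theta) < eps:
        # Nearly parallel tensors: arc degenerates to a line
        return (1 - t) * v0 + t * v1
    return (np.sin((1 - t) * theta) * v0 + np.sin(t * theta) * v1) / np.sin(theta)
```

At `t=0` this returns the first tensor, at `t=1` the second, and in between it blends them while staying on the arc.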
### Models Merged
The following models were included in the merge:
- https://huggingface.co/NeverSleep/CausalLM-RP-34B
- https://huggingface.co/Sao10K/NyakuraV2-34B-Yi-Llama
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: CausalLM-RP-34B
        layer_range: [0, 60]
      - model: NyakuraV2-34B-Yi-Llama
        layer_range: [0, 60]
merge_method: slerp
base_model: CausalLM-RP-34B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
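The `t` lists above are gradients: mergekit spreads the five anchor values across the 60 layers, so self-attention weights lean toward CausalLM-RP near the bottom of the stack and toward NyakuraV2 near the top, with the MLP gradient mirrored. A hedged sketch of how such a gradient could expand to per-layer values (the exact scheme is an internal mergekit detail; this just illustrates piecewise-linear interpolation):

```python
import numpy as np

# Anchor values from the self_attn filter in the config above
anchors = [0, 0.5, 0.3, 0.7, 1]
num_layers = 60

# Spread the anchors evenly across the layer stack and
# linearly interpolate a t value for each layer
xs = np.linspace(0, len(anchors) - 1, num_layers)
per_layer_t = np.interp(xs, np.arange(len(anchors)), anchors)
```

Each layer then gets its own blend ratio, rather than a single global `t` (the bare `value: 0.5` is the default for any tensors the filters don't match).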