# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the SLERP merge method, with 0x0dad0/nous_nb00 as the base model.
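SLERP (spherical linear interpolation) blends each pair of corresponding weight tensors along the great-circle arc between them rather than along a straight line. As a rough illustration only (not mergekit's actual implementation), the sketch below applies the SLERP formula to a single pair of tensors in PyTorch; the fallback to plain linear interpolation for near-parallel vectors is a common convention, assumed here.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t = 0 returns v0, t = 1 returns v1; intermediate values follow the
    great-circle arc between the two parameter vectors.
    """
    v0_flat, v1_flat = v0.flatten().float(), v1.flatten().float()
    # Angle between the two parameter vectors.
    cos_theta = torch.dot(v0_flat, v1_flat) / (v0_flat.norm() * v1_flat.norm() + eps)
    theta = torch.acos(cos_theta.clamp(-1.0, 1.0))
    # Nearly parallel vectors: fall back to plain linear interpolation.
    if theta.abs() < 1e-4:
        return (1.0 - t) * v0 + t * v1
    sin_theta = torch.sin(theta)
    w0 = torch.sin((1.0 - t) * theta) / sin_theta
    w1 = torch.sin(t * theta) / sin_theta
    return (w0 * v0_flat + w1 * v1_flat).reshape(v0.shape).to(v0.dtype)
```

For example, `slerp(0.5, layer_a.weight, layer_b.weight)` blends two corresponding layers equally.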

### Models Merged

The following models were included in the merge:

* 0x0dad0/nous_nb00
* Sumail/Bubble_bee04_2b

### Configuration

The following YAML configuration was used to produce this model:


```yaml
slices:
  - sources:
      - model: 0x0dad0/nous_nb00
        layer_range: [0, 18]
      - model: Sumail/Bubble_bee04_2b
        layer_range: [0, 18]
merge_method: slerp
base_model: 0x0dad0/nous_nb00
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
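In this configuration, `t` is the interpolation factor between the base model (0x0dad0/nous_nb00, `t = 0`) and Sumail/Bubble_bee04_2b (`t = 1`): the `self_attn` and `mlp` filters sweep that factor across the 18 merged layers, and the trailing `value: 0.5` applies to all remaining tensors. Below is a minimal sketch of loading the resulting merge with the Hugging Face `transformers` library; the repository id Sumail/Golden_Waves04_2b is taken from this page's metadata, and the prompt and generation settings are illustrative only.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository id assumed from this model card's metadata.
repo_id = "Sumail/Golden_Waves04_2b"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

prompt = "The quick brown fox"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```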