---
base_model:
  - CultriX/SeQwence-14Bv1
  - CultriX/Qwestion-14B
  - CultriX/Qwen2.5-14B-Wernicke
library_name: transformers
tags:
  - mergekit
  - merge
license: apache-2.0
language:
  - en
metrics:
  - accuracy
pipeline_tag: text-generation
---

# final_model

This is a merge of pre-trained language models created using mergekit.
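The merged model can be used for text generation with the Transformers library. Below is a minimal usage sketch; note that `CultriX/final_model` is a placeholder repository id, so substitute the Hub id under which this merge is actually published.

```python
# Minimal text-generation sketch with transformers.
# NOTE: "CultriX/final_model" is a placeholder repo id; replace it with the
# actual Hugging Face Hub id of this merge.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CultriX/final_model"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge was produced in bfloat16
    device_map="auto",
)

prompt = "Explain the DARE TIES merge method in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```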

## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with CultriX/SeQwence-14Bv1 as the base model.
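For intuition, the sketch below shows roughly what DARE TIES does to a single parameter tensor: each fine-tuned model contributes a task vector (its difference from the base) that is randomly sparsified and rescaled (DARE, controlled by `density`), and the surviving deltas are combined under a per-parameter sign consensus (TIES) with the given `weight`. This is a simplified toy illustration, not mergekit's actual implementation.

```python
# Toy illustration of DARE TIES on one parameter tensor (intuition only,
# NOT mergekit's implementation). density/weight mirror the config parameters.
import torch

def dare_ties_merge(base, finetuned, densities, weights):
    deltas = []
    for ft, density, weight in zip(finetuned, densities, weights):
        delta = ft - base                                   # task vector
        keep = (torch.rand_like(delta) < density).float()   # DARE: randomly drop entries...
        delta = delta * keep / density                      # ...and rescale the survivors
        deltas.append(weight * delta)
    stacked = torch.stack(deltas)
    sign = torch.sign(stacked.sum(dim=0))                   # TIES: elect a sign per parameter
    agree = (torch.sign(stacked) == sign).float()           # keep only deltas that agree
    return base + (stacked * agree).sum(dim=0)

base = torch.randn(4, 4)
finetuned = [base + 0.1 * torch.randn(4, 4) for _ in range(3)]
merged = dare_ties_merge(
    base, finetuned,
    densities=[0.97, 0.99, 0.92],  # illustrative values in the range used below
    weights=[0.30, 0.24, 0.48],    # illustrative values in the range used below
)
print(merged.shape)  # torch.Size([4, 4])
```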

### Models Merged

The following models were included in the merge:

- CultriX/Qwestion-14B
- CultriX/Qwen2.5-14B-Wernicke

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: CultriX/SeQwence-14Bv1
dtype: bfloat16
merge_method: dare_ties
parameters:
  int8_mask: 1.0
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 48]
    model: CultriX/SeQwence-14Bv1
    parameters:
      density: [0.9723868064882017, 1.0, 1.0, 1.0, 1.0, 0.9714039829478123]
      weight: [0.303941801676895, 0.364404551023674, 0.315900913803921, 0.3276032249804535,
        0.32167313684876814, 0.4385348686221433]
  - layer_range: [0, 48]
    model: CultriX/Qwestion-14B
    parameters:
      density: [1.0, 0.9914516102369406, 1.0, 0.8035966798672015, 0.8192028457518323,
        0.9514479609471497]
      weight: [0.23754044230348376, 0.26302919982461254, 0.26313082788173275, 0.17815237275761467,
        0.34301750695974753, 0.5374787613924082]
  - layer_range: [0, 48]
    model: CultriX/Qwen2.5-14B-Wernicke
    parameters:
      density: [0.9250003667144193, 0.9603820599250329, 0.8766642760655986, 1.0, 0.9993615706551808,
        0.7459506348277176]
      weight: [0.48038202535582214, 0.5870170049221364, 0.27054455623315504, 0.06016442415521043,
        0.4012739361231067, 0.26890177448533076]
```
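To reproduce the merge, the configuration above can be passed to mergekit. Below is a minimal sketch assuming mergekit's documented Python API, with the config saved as `merge-config.yml` and a placeholder output path; the `mergekit-yaml` command-line tool is an equivalent alternative.

```python
# Sketch: reproducing the merge with mergekit's Python API.
# Assumes the YAML above is saved as merge-config.yml and mergekit is installed
# (pip install mergekit). Paths are placeholders.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("merge-config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./final_model",            # output directory for the merged weights
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer alongside the weights
    ),
)
```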