---
base_model:
  - chihoonlee10/T3Q-Mistral-Orca-Math-DPO
  - yam-peleg/Experiment26-7B
  - liminerity/M7-7b
  - LeroyDyer/Mixtral_AI_Cyber_3.1_SFT
library_name: transformers
tags:
  - mergekit
  - merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

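Assuming the published repository id is `LeroyDyer/LCARS_TOP_SCORE` (inferred from this repo's name; substitute the actual id if it differs), a minimal inference sketch with the Hugging Face `transformers` library looks like this:

```python
# Minimal sketch: load the merged model for inference with transformers.
# The repo id below is an assumption based on this repository's name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LeroyDyer/LCARS_TOP_SCORE"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge's `dtype: float16`
    device_map="auto",
)

prompt = "Solve: what is 12 * 17?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
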
## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [LeroyDyer/Mixtral_AI_Cyber_3.1_SFT](https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_3.1_SFT) as the base.
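
In brief, TIES builds a task vector (fine-tuned weights minus base weights) for each model, trims each vector to its configured density, elects a per-parameter sign by summing the trimmed vectors, and averages only the values that agree with the elected sign. The NumPy sketch below illustrates this for a single tensor; it is a simplified toy, not mergekit's implementation, and the function name and trim/normalization details are assumptions.

```python
import numpy as np

def ties_merge_tensor(base, tuned, densities, weights):
    """Toy single-tensor TIES merge (illustration, not mergekit's code)."""
    deltas = []
    for ft, density, weight in zip(tuned, densities, weights):
        tau = ft - base                                  # task vector
        k = int(round(density * tau.size))               # entries to keep
        if k == 0:
            deltas.append(np.zeros_like(tau))
            continue
        cutoff = np.sort(np.abs(tau), axis=None)[-k]     # magnitude threshold
        tau = np.where(np.abs(tau) >= cutoff, tau, 0.0)  # trim to density
        deltas.append(weight * tau)                      # apply merge weight
    stacked = np.stack(deltas)
    elected = np.sign(stacked.sum(axis=0))               # elect sign per entry
    agree = np.sign(stacked) == elected                  # drop disagreeing values
    kept = np.where(agree, stacked, 0.0)
    count = np.maximum(agree.sum(axis=0), 1)             # avoid divide-by-zero
    return base + kept.sum(axis=0) / count               # disjoint mean + base
```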

### Models Merged

The following models were included in the merge:

* [liminerity/M7-7b](https://huggingface.co/liminerity/M7-7b)
* [chihoonlee10/T3Q-Mistral-Orca-Math-DPO](https://huggingface.co/chihoonlee10/T3Q-Mistral-Orca-Math-DPO)
* [yam-peleg/Experiment26-7B](https://huggingface.co/yam-peleg/Experiment26-7B)

### Configuration

The following YAML configuration was used to produce this model:


```yaml
models:
  - model: liminerity/M7-7b
    parameters:
      density: [0.87, 0.721, 0.451] # density gradient
      weight: 0.876
  - model: chihoonlee10/T3Q-Mistral-Orca-Math-DPO
    parameters:
      density: 0.232
      weight: [0.36, 0.3, 0.437, 0.76] # weight gradient
  - model: yam-peleg/Experiment26-7B
    parameters:
      density: 0.475
      weight:
        - filter: mlp
          value: 0.5
        - value: 0
merge_method: ties
base_model: LeroyDyer/Mixtral_AI_Cyber_3.1_SFT
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
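
To reproduce the merge, save the configuration above to a file (the path `config.yml` and the output directory below are illustrative choices) and run it through mergekit. This sketch follows mergekit's Python API as documented upstream; verify the imports and `MergeOptions` fields against your installed version.

```python
# Sketch: run the merge config above via mergekit's Python API.
# Paths are illustrative; check signatures against your mergekit version.
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    "./LCARS_TOP_SCORE-merge",   # output directory (arbitrary name)
    options=MergeOptions(
        cuda=True,               # set False to merge on CPU
        copy_tokenizer=True,     # copy the base model's tokenizer
        lazy_unpickle=True,      # reduce peak memory while loading shards
    ),
)
```

Equivalently, the `mergekit-yaml` command-line entry point accepts the same configuration file.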