---
license: other
license_name: yi-license
license_link: https://huggingface.co/01-ai/Yi-34B/blob/main/LICENSE
language:
  - en
library_name: transformers
base_model: []
tags:
  - mergekit
  - merge
  - Yi
  - exllama
  - exllamav2
  - exl2
---

# RPmerge

See the main model card: https://huggingface.co/brucethemoose/Yi-34B-200K-RPMerge

Quantized with the default exl2 settings; still investigating the benefits and drawbacks of long-context (32K) quantization.
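
For reference, here is a minimal sketch of loading this quant with the ExLlamaV2 Python API. The local path, context cap, sampler values, and prompt text are assumptions for illustration; see the main model card linked above for usage guidance.

```python
# Illustrative ExLlamaV2 loading/generation sketch; paths and settings are assumptions.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/Yi-34B-200K-RPMerge-exl2"  # local download of this repo
config.prepare()
config.max_seq_len = 32768  # cap the 200K context to fit in less VRAM

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split layers across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.9
settings.top_p = 0.9

# Vicuna-style prompt, per the format comments in the merge config below
prompt = "SYSTEM: You are a helpful writing assistant.\nUSER: Write an opening scene.\nASSISTANT:"
print(generator.generate_simple(prompt, settings, 200))
```

Capping `max_seq_len` trades away context length for a smaller KV cache; raise it as hardware allows.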

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with /home/alpha/Models/Raw/chargoddard_Yi-34B-200K-Llama as the base.
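
In rough terms, DARE randomly drops a fraction of each fine-tune's parameter deltas and rescales the survivors by 1/density, and TIES then resolves sign conflicts between the models before summing. Below is a toy per-tensor sketch of that idea; it is illustrative only, not mergekit's actual implementation, and all names are hypothetical.

```python
import torch

def dare_ties_merge(base, finetuned, weights, densities):
    """Toy per-tensor DARE-TIES sketch (not mergekit's implementation)."""
    deltas = []
    for ft, w, d in zip(finetuned, weights, densities):
        delta = ft - base                     # task vector vs. the base model
        keep = torch.rand_like(delta) < d     # DARE: keep ~density fraction of entries
        delta = torch.where(keep, delta / d, torch.zeros_like(delta))  # rescale survivors
        deltas.append(w * delta)              # apply per-model merge weight
    stacked = torch.stack(deltas)
    sign = torch.sign(stacked.sum(dim=0))     # simplified TIES sign election
    agree = torch.sign(stacked) == sign       # drop updates that fight the majority sign
    return base + (stacked * agree).sum(dim=0)
```

The per-model `weight` and `density` values in the configuration below map directly onto `weights` and `densities` here.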

### Models Merged

The following models were included in the merge:

* /home/alpha/Models/Raw/migtissera_Tess-34B-v1.5b
* /home/alpha/Models/Raw/migtissera_Tess-M-Creative-v1.0
* /home/alpha/Models/Raw/cgato_Thespis-34b-DPO-v0.7
* /home/alpha/Models/Raw/Nous-Capybara-34B
* /home/alpha/Models/Raw/admo_limarp
* /home/alpha/Models/Raw/DrNicefellow_ChatAllInOne-Yi-34B-200K-V1

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /home/alpha/Models/Raw/chargoddard_Yi-34B-200K-Llama
    # No parameters necessary for base model
  - model: /home/alpha/Models/Raw/migtissera_Tess-34B-v1.5b
    # Emphasize the beginning of Vicuna format models
    parameters:
      weight: 0.19
      density: 0.59
  - model: /home/alpha/Models/Raw/Nous-Capybara-34B
    parameters:
      weight: 0.19
      density: 0.55
  # Vicuna format
  - model: /home/alpha/Models/Raw/migtissera_Tess-M-Creative-v1.0
    parameters:
      weight: 0.05
      density: 0.55
  - model: /home/alpha/Models/Raw/DrNicefellow_ChatAllInOne-Yi-34B-200K-V1
    parameters:
      weight: 0.19
      density: 0.55
  - model: /home/alpha/Models/Raw/admo_limarp
    parameters:
      weight: 0.19
      density: 0.48
  - model: /home/alpha/Models/Raw/cgato_Thespis-34b-DPO-v0.7
    parameters:
      weight: 0.19
      density: 0.59

merge_method: dare_ties
tokenizer_source: union
base_model: /home/alpha/Models/Raw/chargoddard_Yi-34B-200K-Llama
parameters:
  int8_mask: true
dtype: bfloat16
```
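
A merge like this should be reproducible with mergekit's `mergekit-yaml` entry point, e.g. `mergekit-yaml config.yaml ./merged-model` (the config filename and output path here are placeholders).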