---
license: llama2
language:
  - en
pipeline_tag: conversational
tags:
  - Xwin
  - Euryale 1.3
  - frankenmerge
  - 90b
---

# BigWeave v6 90B

A Goliath-120b-style frankenmerge of Xwin-LM-70b-v0.1 and Euryale-1.3-70b. The goal is to find other merge combinations that work well.

The version number is only for keeping track of the merges; only results that seem to work reasonably well are kept and published.

## Prompting Format

Vicuna and Alpaca.
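
Neither template is specific to this merge; the sketches below show the commonly used layouts, with `{prompt}` standing in for the user's message and the surrounding system text being a typical default rather than a requirement.

Vicuna:

```
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. USER: {prompt} ASSISTANT:
```

Alpaca:

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
```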

## Merge process

The models used in the merge are Xwin-LM-70b-v0.1 and Euryale-1.3-70b.

The layer mix (see the mergekit sketch after the list):

- range 0, 12: Xwin
- range 9, 14: Euryale
- range 12, 62: Xwin
- range 54, 71: Euryale
- range 62, 80: Xwin
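
For illustration, a minimal mergekit passthrough config matching the slice layout above could look like the sketch below. The Hugging Face repository IDs for the two source models are assumptions and should be replaced with whatever paths were actually used; the dtype is likewise just a common choice.

```yaml
slices:
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1    # assumed repo ID for Xwin-LM-70b-v0.1
        layer_range: [0, 12]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B   # assumed repo ID for Euryale-1.3-70b
        layer_range: [9, 14]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [12, 62]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B
        layer_range: [54, 71]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [62, 80]
merge_method: passthrough
dtype: float16
```

With mergekit installed, a config like this is run with `mergekit-yaml config.yml ./output-model`; the passthrough method simply stacks the listed layer slices in order, which is how the overlapping 70b ranges add up to a roughly 90b model.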

## Acknowledgements

- @Xwin-LM for creating Xwin
- @Sao10K for creating Euryale
- @alpindale for creating the original Goliath
- @chargoddard for developing mergekit