---
base_model: []
tags:
  - mergekit
  - merge
---

# Miquella 120B

This is a merge of pre-trained language models created using mergekit. It is an attempt at re-creating goliath-120b using the new miqu-1-70b model in place of Xwin.

The merge ratios are the same as goliath's, except that Xwin is swapped for miqu.
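Since goliath-120b is a passthrough layer-interleaving merge, a recipe of this kind can be sketched as a mergekit config roughly like the one below. The model names and layer ranges here are illustrative assumptions, not the actual goliath/miquella values:

```yaml
# Illustrative mergekit passthrough config (a sketch, not the real recipe).
# Layer ranges are placeholders; the actual merge interleaves many slices.
slices:
  - sources:
      - model: miqu-1-70b            # assumed: takes Xwin's place in the goliath layout
        layer_range: [0, 16]
  - sources:
      - model: Euryale-1.3-L2-70B    # assumed second component, as in goliath
        layer_range: [8, 24]
merge_method: passthrough
dtype: float16
```

A config like this is passed to `mergekit-yaml` to produce the merged checkpoint.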

## Models Merged

The following models were included in the merge: