
Chimera 120b


An auto-regressive causal LM created by combining three fine-tuned models into one via a passthrough merge, stacking layer slices in order. This model is a sliced passthrough merge of Sheep Duck, Xwin, and Synthia. I wanted to make Sheep Duck part of Giant Macaroni, but the Macaroni and Sheep Duck models don't line up (I think because of the rotary embedding). Honestly, even without any fine-tuning yet, I think this model might be slightly better on my logic and reasoning tests. I'll try to update more here as I go and release more versions as I am able to fine-tune these further.
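
The merge was done with @chargoddard's mergekit. As a rough sketch of what a stacked passthrough config looks like (the repo names and layer ranges below are placeholders for illustration, not the exact Chimera recipe):

```yaml
# Hypothetical mergekit passthrough config - model repos and layer
# ranges are placeholders, not the actual Chimera-120b recipe.
merge_method: passthrough
dtype: float16
slices:
  - sources:
      - model: Riiid/sheep-duck-llama-2-70b-v1.1
        layer_range: [0, 20]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [10, 30]
  - sources:
      - model: migtissera/Synthia-70B-v1.2
        layer_range: [20, 40]
  # ...additional overlapping slices continue up the stack,
  # ending with the final layers of Xwin.
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [60, 80]
```

A config in this shape is then run with something like `mergekit-yaml config.yml ./chimera-120b`.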

Prompting Format

Both Vicuna and Alpaca formats will work, but Vicuna is likely the better fit since the final layers belong primarily to Xwin.
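
For reference, the standard templates look like this (the system text is just the usual default and can be adjusted):

```
Vicuna:
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. USER: {prompt} ASSISTANT:

Alpaca:
### Instruction:
{prompt}

### Response:
```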

Benchmarks

Coming soon.

Acknowledgements

@chargoddard - mergekit.
@migtissera - for Tess-XL, which inspired me to believe that open models can compete with the big commercial models on logic tasks.
@alpindale - for Goliath-120B, which started this crazy endeavor for us all.
@nsfwthrowitaway69 - for sharing the merge config for Venus-120B and getting me off the starting block with some questions on mergekit and tokenizers.

Keep it open and keep sharing, everyone! With Mixtral and the MoE changes to mergekit, coupled with these larger merged models, I think the sky is the limit for us all. I can only imagine what would happen if we took a group of these 120B models, fine-tuned each of them a bit, and applied the Mixtral MoE merge method to them. I would also point out that if a clever VC came along and funded that work, the people you need are right here on Hugging Face, and all they need is the equipment to do it on.
