---
base_model:
- Nohobby/MN-12B-Siskin-TEST3
- Nohobby/MN-12B-Siskin-v0.1
- MarinaraSpaghetti/NemoMix-Unleashed-12B
library_name: transformers
tags:
- mergekit
- merge
---
# MN-12B-Siskin-TEST4b

So the entire MN-12B-Siskin-TEST* series turned out badly. Uh oh. I'm leaving it here for archival purposes. If you want a usable model, consider [MN-12B-Siskin-v0.1](https://huggingface.co/Nohobby/MN-12B-Siskin-v0.1) or [MN-12B-Siskin-v0.2](https://huggingface.co/Nohobby/MN-12B-Siskin-v0.2) instead.

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [Nohobby/MN-12B-Siskin-v0.1](https://huggingface.co/Nohobby/MN-12B-Siskin-v0.1) as the base.

### Models Merged

The following models were included in the merge:

* [Nohobby/MN-12B-Siskin-TEST3](https://huggingface.co/Nohobby/MN-12B-Siskin-TEST3)
* [MarinaraSpaghetti/NemoMix-Unleashed-12B](https://huggingface.co/MarinaraSpaghetti/NemoMix-Unleashed-12B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Nohobby/MN-12B-Siskin-TEST3
  - model: MarinaraSpaghetti/NemoMix-Unleashed-12B
merge_method: model_stock
base_model: Nohobby/MN-12B-Siskin-v0.1
dtype: bfloat16
```
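For anyone who wants to reproduce or tweak the merge, a configuration like the one above can be run with mergekit's `mergekit-yaml` command. This is a minimal sketch; the file name and output directory here are placeholders, not part of this repo:

```shell
# Assumes the YAML config above is saved as config.yaml.
# The output directory name is an example; pick any path you like.
pip install mergekit
mergekit-yaml config.yaml ./MN-12B-Siskin-TEST4b --cuda
```

The `--cuda` flag is optional and just runs the tensor arithmetic on GPU; the merge itself is deterministic either way.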