---
base_model:
- ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.3
- nbeerbower/Mistral-Nemo-Prism-12B-v7
- elinas/Chronos-Gold-12B-1.0
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
## Model details
![image/webp](https://cdn-uploads.huggingface.co/production/uploads/66c1cc08453a7ef6c5fe657a/6igjo4XoUKl9P57E18L1A.webp)
*The sequel no one asked for*
Feedback is welcome and encouraged.
---
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
### Merge Method
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method using [nbeerbower/Mistral-Nemo-Prism-12B-v7](https://huggingface.co/nbeerbower/Mistral-Nemo-Prism-12B-v7) as a base.
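Roughly, DARE randomly drops a fraction of each model's delta from the base (keeping elements with probability `density` and rescaling the survivors), and TIES then resolves sign conflicts between the surviving deltas before they are added back onto the base. The toy sketch below illustrates that flow on flat tensors; it is not mergekit's implementation, and the function and parameter names are illustrative only.

```python
# Toy sketch of DARE-TIES on flat tensors -- illustrative only, not mergekit's code.
import torch

def dare_ties_merge(base, finetuned, densities, weights):
    """Merge fine-tuned tensors into `base` via DARE (random drop + rescale)
    followed by a TIES-style sign election over the surviving deltas."""
    deltas = []
    for ft, density, weight in zip(finetuned, densities, weights):
        delta = ft - base                           # task vector relative to the base
        keep = torch.bernoulli(torch.full_like(delta, density))
        delta = delta * keep / density              # DARE: drop elements, rescale survivors
        deltas.append(weight * delta)
    stacked = torch.stack(deltas)
    elected_sign = torch.sign(stacked.sum(dim=0))   # TIES: elect a sign per parameter
    agree = torch.sign(stacked) == elected_sign     # keep only agreeing contributions
    merged_delta = (stacked * agree).sum(dim=0)     # normalize: false -> plain weighted sum
    return base + merged_delta

base = torch.zeros(8)
tuned = [torch.randn(8), torch.randn(8)]
print(dare_ties_merge(base, tuned, densities=[0.5, 0.5], weights=[0.5, 0.5]))
```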
### Models Merged
The following models were included in the merge:
* [ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.3](https://huggingface.co/ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.3)
* [elinas/Chronos-Gold-12B-1.0](https://huggingface.co/elinas/Chronos-Gold-12B-1.0)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: nbeerbower/Mistral-Nemo-Prism-12B-v7
    # no parameters necessary for the base model
  - model: ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.3
    parameters:
      density: 0.5
      weight: 0.5
  - model: elinas/Chronos-Gold-12B-1.0
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: nbeerbower/Mistral-Nemo-Prism-12B-v7
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```
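### Usage
The merged model can be loaded like any other `transformers` causal LM. A minimal sketch follows, assuming the weights are published under `Triangle104/Chronos-Prism_V2.0` (repo id inferred from this card); adjust the repo id, dtype, and sampling settings to your setup.

```python
# Minimal generation sketch; the repo id below is an assumption based on this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Triangle104/Chronos-Prism_V2.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

prompt = "Write a short scene set in a rain-soaked neon city."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```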