---
base_model:
- jeiku/Zephyr_beta_32k_7B
- jeiku/Synthetic_Soul_1k_Mistral_128
- jeiku/Theory_of_Mind_Mistral
- monsterapi/mistral_7b_norobots
- monsterapi/mistral_7b_WizardLMEvolInstruct70k
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
# 32kTest_7B
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [jeiku/Zephyr_beta_32k_7B](https://huggingface.co/jeiku/Zephyr_beta_32k_7B) as the base model.
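As a rough intuition for what Model Stock does, the paper interpolates between the average of the fine-tuned weights and the pretrained base, with the interpolation ratio derived from the angle between the models' task vectors. The sketch below is my reading of that idea in NumPy, not mergekit's implementation; the function name and the empirical angle estimate are illustrative assumptions.

```python
# Illustrative sketch of Model Stock-style interpolation (not mergekit's code):
# merged = t * w_avg + (1 - t) * w_base, with
# t = k*cos(theta) / (1 + (k-1)*cos(theta)) for k models,
# where theta is the (assumed roughly constant) angle between task vectors.
import numpy as np

def model_stock(base, finetuned):
    k = len(finetuned)
    deltas = [w - base for w in finetuned]  # task vectors relative to the base
    # Estimate the pairwise cosine between task vectors empirically.
    cos_vals = []
    for i in range(k):
        for j in range(i + 1, k):
            a, b = deltas[i], deltas[j]
            cos_vals.append(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    cos_t = float(np.mean(cos_vals))
    t = k * cos_t / (1 + (k - 1) * cos_t)
    w_avg = np.mean(finetuned, axis=0)
    return t * w_avg + (1 - t) * base

# Orthogonal task vectors: cos(theta) = 0, so the merge collapses to the base.
base = np.zeros(4)
fts = [np.array([1.0, 0.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0, 0.0])]
merged = model_stock(base, fts)
print(merged)
```

With identical fine-tunes (cosine of 1) the ratio `t` becomes 1 and the result is just the plain average, which matches the intuition that agreement between fine-tunes should pull the merge away from the base.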
### Models Merged
The following models were included in the merge (the `+` notation is mergekit's syntax for a LoRA adapter applied on top of a base model):
* [jeiku/Zephyr_beta_32k_7B](https://huggingface.co/jeiku/Zephyr_beta_32k_7B) + [jeiku/Synthetic_Soul_1k_Mistral_128](https://huggingface.co/jeiku/Synthetic_Soul_1k_Mistral_128)
* [jeiku/Zephyr_beta_32k_7B](https://huggingface.co/jeiku/Zephyr_beta_32k_7B) + [jeiku/Theory_of_Mind_Mistral](https://huggingface.co/jeiku/Theory_of_Mind_Mistral)
* [jeiku/Zephyr_beta_32k_7B](https://huggingface.co/jeiku/Zephyr_beta_32k_7B) + [monsterapi/mistral_7b_norobots](https://huggingface.co/monsterapi/mistral_7b_norobots)
* [jeiku/Zephyr_beta_32k_7B](https://huggingface.co/jeiku/Zephyr_beta_32k_7B) + [monsterapi/mistral_7b_WizardLMEvolInstruct70k](https://huggingface.co/monsterapi/mistral_7b_WizardLMEvolInstruct70k)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
- model: jeiku/Zephyr_beta_32k_7B+monsterapi/mistral_7b_WizardLMEvolInstruct70k
- model: jeiku/Zephyr_beta_32k_7B+jeiku/Synthetic_Soul_1k_Mistral_128
- model: jeiku/Zephyr_beta_32k_7B+jeiku/Theory_of_Mind_Mistral
- model: jeiku/Zephyr_beta_32k_7B+monsterapi/mistral_7b_norobots
merge_method: model_stock
base_model: jeiku/Zephyr_beta_32k_7B
dtype: bfloat16
```
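Before running mergekit, the configuration above can be sanity-checked programmatically. A minimal sketch, assuming PyYAML is installed; the config text is copied verbatim from this card:

```python
# Sanity-check the merge configuration before handing it to mergekit.
# Each "+" entry pairs a base model with a LoRA adapter (mergekit syntax).
import yaml

CONFIG = """
models:
- model: jeiku/Zephyr_beta_32k_7B+monsterapi/mistral_7b_WizardLMEvolInstruct70k
- model: jeiku/Zephyr_beta_32k_7B+jeiku/Synthetic_Soul_1k_Mistral_128
- model: jeiku/Zephyr_beta_32k_7B+jeiku/Theory_of_Mind_Mistral
- model: jeiku/Zephyr_beta_32k_7B+monsterapi/mistral_7b_norobots
merge_method: model_stock
base_model: jeiku/Zephyr_beta_32k_7B
dtype: bfloat16
"""

cfg = yaml.safe_load(CONFIG)
assert cfg["merge_method"] == "model_stock"
for entry in cfg["models"]:
    base, adapter = entry["model"].split("+")
    assert base == cfg["base_model"]  # every branch starts from the same base
print(f"{len(cfg['models'])} models, base = {cfg['base_model']}")
```

The merge itself is then produced by pointing mergekit at this file (for example via its `mergekit-yaml` command-line tool).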