---
library_name: transformers
license: llama3
tags:
- mergekit
- merge
---
### GGUF
Thanks to **HumanBoiii** for the quantization:
- **GGUF:** **[HumanBoiii/Mythorica-L3-8B-Q4_K_M-GGUF](https://huggingface.co/HumanBoiii/Mythorica-L3-8B-Q4_K_M-GGUF)**
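A minimal sketch of loading the Q4_K_M quant with `llama-cpp-python`; the glob-style `filename` pattern and sampling settings are assumptions, so check the repo's actual file list:

```python
# pip install llama-cpp-python huggingface_hub
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="HumanBoiii/Mythorica-L3-8B-Q4_K_M-GGUF",
    filename="*Q4_K_M.gguf",  # assumed pattern -- verify against the repo's files
    n_ctx=8192,               # Llama 3 context window
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Describe a misty elven harbor at dawn."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```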
---
**Mythorica** is a roleplay (RP) model designed for vivid storytelling, engaging dialogue, and immersive world-building. Inspired by the fusion of fantasy and realism, it excels at crafting intricate narratives and breathing life into characters, making it a versatile choice for writers and roleplayers alike.
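A minimal usage sketch with 🤗 Transformers (the card's `library_name`), assuming the repo id `Arkana08/Mythorica-L3-8B` and the stock Llama 3 chat template; the sampling settings are illustrative starting points, not tuned recommendations:

```python
# pip install transformers torch accelerate
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Arkana08/Mythorica-L3-8B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a vivid fantasy narrator."},
    {"role": "user", "content": "Open a scene in a rain-soaked port city."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```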
---
### Merge Method
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099)-[TIES](https://arxiv.org/abs/2306.01708) method: DARE randomly drops a fraction of each fine-tuned model's delta parameters and rescales the survivors, while TIES resolves sign conflicts between the weighted deltas before they are summed onto the base model.
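For intuition, here is an illustrative sketch of the DARE drop-and-rescale step on a single weight tensor (not mergekit's actual implementation):

```python
import torch

def dare_delta(base: torch.Tensor, finetuned: torch.Tensor, density: float) -> torch.Tensor:
    """DARE: Drop And REscale a task vector.

    Randomly zeroes a (1 - density) fraction of the fine-tune's delta
    parameters and rescales the survivors by 1/density, so the expected
    update stays unchanged.
    """
    delta = finetuned - base                      # task vector
    keep = torch.rand_like(delta) < density       # keep ~density fraction
    return torch.where(keep, delta / density, torch.zeros_like(delta))
```

TIES-style sign election then keeps, for each parameter, only the rescaled deltas whose sign agrees with the weighted majority before adding the merged delta to the base model.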
### Models Merged
The following models were included in the merge:
- **[ChaoticNeutrals/Hathor_Tahsin-L3-8B-v0.9](https://huggingface.co/ChaoticNeutrals/Hathor_Tahsin-L3-8B-v0.9)**
- **[Sao10K/L3-8B-Chara-v1-Alpha](https://huggingface.co/Sao10K/L3-8B-Chara-v1-Alpha)**
- **[Arkana08/LexiMaid-L3-8B](https://huggingface.co/Arkana08/LexiMaid-L3-8B)**
## Configuration
The following YAML configuration was used to produce Mythorica:
```yaml
models:
  - model: ChaoticNeutrals/Hathor_Tahsin-L3-8B-v0.9
    parameters:
      weight: 0.5
      density: 0.8
  - model: Arkana08/LexiMaid-L3-8B
    parameters:
      weight: 0.3
      density: 0.7
  - model: Sao10K/L3-8B-Chara-v1-Alpha
    parameters:
      weight: 0.2
      density: 0.75
merge_method: dare_ties
base_model: ChaoticNeutrals/Hathor_Tahsin-L3-8B-v0.9
parameters:
  int8_mask: true
dtype: bfloat16
```
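To reproduce the merge, save the configuration above (as `mythorica.yaml`, a placeholder name chosen here) and run it through mergekit. A sketch using mergekit's Python entry point, which should be verified against the installed mergekit version:

```python
# pip install mergekit
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# "mythorica.yaml" is a placeholder path for the config shown above.
with open("mythorica.yaml", "r", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    config,
    out_path="./Mythorica-L3-8B",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is available
        copy_tokenizer=True,             # carry the base model's tokenizer over
    ),
)
```

The same configuration also works with the `mergekit-yaml` command-line tool.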
## Credits
Thanks to the creators of the source models:
- **[ChaoticNeutrals/Hathor_Tahsin-L3-8B-v0.9](https://huggingface.co/ChaoticNeutrals/Hathor_Tahsin-L3-8B-v0.9)**
- **[Sao10K/L3-8B-Chara-v1-Alpha](https://huggingface.co/Sao10K/L3-8B-Chara-v1-Alpha)**