---
base_model:
- Sao10K/L3-8B-Stheno-v3.1
- Nitral-Archive/Poppy_Porpoise-Biomix
- Hastagaras/HALU-8B-LLAMA3-BRSLURP
- crestf411/L3-8B-sunfall-abliterated-v0.1
- cgato/L3-TheSpice-8b-v0.8.3
- Nitral-AI/Poppy_Porpoise-0.72-L3-8B
library_name: transformers
tags:
- mergekit
- merge
- not-for-all-audiences
---
Why choose?
The full model name is "Llama-3-8B-Poppy-Sunspice".
An RP model, pretty creative.
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64f74b6e6389380c77562762/YsE0lCbC-5EjYS3d7-Iey.png)
# Thanks mradermacher for the quants
* [GGUF](https://huggingface.co/mradermacher/L3-8B-Poppy-Sunspice-GGUF)
* [GGUF imatrix](https://huggingface.co/mradermacher/L3-8B-Poppy-Sunspice-i1-GGUF)
# Thanks MarsupialAI for the quants
* [EXL2](https://huggingface.co/MarsupialAI/L3-8B-Poppy-Sunspice_EXL2)
# Update/Notice:
This model has a tendency toward endless generations. If this occurs, adding a bit of repetition or presence penalty in your sampler settings helps.
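The card doesn't specify exact sampler values, but a minimal sketch of the usual CTRL-style repetition penalty (the kind exposed as `repetition_penalty` in most inference frontends) looks like this; the function name and toy logits are illustrative:

```python
def apply_repetition_penalty(logits, generated_ids, penalty=1.2):
    """Penalize tokens that have already been generated: positive
    logits are divided by the penalty, negative logits multiplied,
    so repeated tokens always become less likely."""
    out = list(logits)
    for tok in set(generated_ids):
        if out[tok] > 0:
            out[tok] /= penalty
        else:
            out[tok] *= penalty
    return out

# Toy vocabulary of 3 tokens; tokens 0 and 1 were already generated.
logits = [2.0, -1.0, 0.5]
penalized = apply_repetition_penalty(logits, generated_ids=[0, 1], penalty=1.2)
# token 0: 2.0 / 1.2 ≈ 1.667, token 1: -1.0 * 1.2 = -1.2, token 2 untouched
```

A penalty slightly above 1.0 (e.g. 1.05–1.2) is typically enough to break repetition loops without hurting coherence.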
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method.
### Models Merged
The following models were included in the merge:
* [Sao10K/L3-8B-Stheno-v3.1](https://huggingface.co/Sao10K/L3-8B-Stheno-v3.1)
* [Nitral-Archive/Poppy_Porpoise-Biomix](https://huggingface.co/Nitral-Archive/Poppy_Porpoise-Biomix)
* [Hastagaras/HALU-8B-LLAMA3-BRSLURP](https://huggingface.co/Hastagaras/HALU-8B-LLAMA3-BRSLURP)
* [crestf411/L3-8B-sunfall-abliterated-v0.1](https://huggingface.co/crestf411/L3-8B-sunfall-abliterated-v0.1)
* [cgato/L3-TheSpice-8b-v0.8.3](https://huggingface.co/cgato/L3-TheSpice-8b-v0.8.3)
* [Nitral-AI/Poppy_Porpoise-0.72-L3-8B](https://huggingface.co/Nitral-AI/Poppy_Porpoise-0.72-L3-8B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: crestf411/L3-8B-sunfall-abliterated-v0.1
    parameters:
      weight: 0.1
  - model: Nitral-AI/Poppy_Porpoise-0.72-L3-8B
    parameters:
      weight: 0.3
  - model: Nitral-Archive/Poppy_Porpoise-Biomix
    parameters:
      weight: 0.1
  - model: Sao10K/L3-8B-Stheno-v3.1
    parameters:
      weight: 0.2
  - model: Hastagaras/HALU-8B-LLAMA3-BRSLURP
    parameters:
      weight: 0.1
  - model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      weight: 0.2
merge_method: linear
dtype: float16
```
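Conceptually, the linear merge method computes each output parameter as a weighted average of the corresponding parameter in every source model. A minimal sketch with toy two-element tensors standing in for real model weights (note the six weights in the config above sum to 1.0):

```python
# Per-model weights taken from the YAML config above (they sum to 1.0).
weights = [0.1, 0.3, 0.1, 0.2, 0.1, 0.2]

# Toy stand-ins for one tensor taken from each of the six source models.
model_tensors = [
    [1.0, 0.0],
    [0.0, 1.0],
    [1.0, 1.0],
    [2.0, 2.0],
    [0.0, 0.0],
    [1.0, 2.0],
]

# Element-wise weighted average, as in the linear merge method.
merged = [
    sum(w * t[i] for w, t in zip(weights, model_tensors))
    for i in range(len(model_tensors[0]))
]
# merged == [0.8, 1.2]
```

Because the weights sum to 1.0, the merged parameters stay on the same scale as the source models, which is why linear merges of same-architecture fine-tunes tend to remain stable.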
# Prompt Template:
```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>
{input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
{output}<|eot_id|>
```
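If you are assembling this template by hand rather than through a chat frontend, a small helper like the one below builds the prompt up to the point where the model starts writing. The function name is illustrative; note that the official Llama-3 chat format places a blank line after each `<|end_header_id|>` token:

```python
def build_llama3_prompt(system_prompt, user_input):
    # Assemble the Llama-3 chat format shown above, stopping right
    # after the assistant header so the model generates the reply.
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama3_prompt("You are a creative roleplay partner.", "Hello!")
```

The model's reply is then expected to end with its own `<|eot_id|>` token, which most inference backends treat as the stop token.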