|
--- |
|
base_model: v000000/L3-8B-UGI-DontPlanToEnd-test |
|
library_name: transformers |
|
tags: |
|
- mergekit |
|
- merge |
|
- llama |
|
- llama-cpp |
|
--- |
|
|
|
# v000000/L3-8B-UGI-DontPlanToEnd-test-GGUF |
|
This model was converted to GGUF format from [`v000000/L3-8B-UGI-DontPlanToEnd-test`](https://huggingface.co/v000000/L3-8B-UGI-DontPlanToEnd-test) using llama.cpp.

Refer to the [original model card](https://huggingface.co/v000000/L3-8B-UGI-DontPlanToEnd-test) for more details on the model.
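
For inference, the GGUF files can be loaded directly with llama.cpp's CLI. A minimal sketch, assuming a recent llama.cpp build (where the binary is named `llama-cli`) and a quant filename following the usual `<model>-<quant>.gguf` convention; check this repo's file list for the actual name:

```bash
# Run the Q8_0 quant with llama.cpp. The filename is an assumption;
# substitute the actual .gguf file from this repo's file list.
./llama-cli \
  -m L3-8B-UGI-DontPlanToEnd-test-Q8_0.gguf \
  -p "Write a short scene set in a lighthouse." \
  -n 256 \
  -c 8192
```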
|
|
|
# List of quants in this repo:

* Q8_0 (imatrix)

* FP16

Both are provided in .GGUF format for llama.cpp inference.
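
To pull a single quant instead of the whole repo, the Hugging Face CLI can filter by filename pattern. A sketch, assuming the Q8_0 file name contains `Q8_0` (the usual GGUF naming convention):

```bash
# Download only the Q8_0 quant from this repo. The glob is an assumption
# based on common GGUF naming; adjust it to the actual filename.
pip install -U "huggingface_hub[cli]"
huggingface-cli download v000000/L3-8B-UGI-DontPlanToEnd-test-GGUF \
  --include "*Q8_0*" \
  --local-dir ./models
```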
|
|
|
### Evaluation stats for the `test-test-test-test` submission
|
|
|
```yaml
|
num_battles: 40982 |
|
num_wins: 23536 |
|
celo_rating: 1214.08 |
|
safety_score: 0.96 |
|
propriety_score: 0.7065322049358942 |
|
propriety_total_count: 19733.0 |
|
``` |
|
|
|
# Merge
|
|
|
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). |
|
|
|
## Merge Details |
|
### Merge Method |
|
|
|
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [tannedbum/L3-Nymeria-8B](https://huggingface.co/tannedbum/L3-Nymeria-8B) as the base.
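
For context, DARE sparsifies each model's parameter delta before the TIES-style sign election and merge: each delta element is dropped with probability $p$ and the survivors are rescaled, so the `density` values in the config below correspond to $1 - p$, the fraction of delta parameters kept:

$$\tilde{\delta}_i = \frac{m_i\,\delta_i}{1 - p}, \qquad m_i \sim \mathrm{Bernoulli}(1 - p)$$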
|
|
|
### Models Merged |
|
|
|
The following models were included in the merge: |
|
* [PJMixers/LLaMa-3-CursedStock-v1.8-8B](https://huggingface.co/PJMixers/LLaMa-3-CursedStock-v1.8-8B) |
|
* [bluuwhale/L3-SthenoMaidBlackroot-8B-V1](https://huggingface.co/bluuwhale/L3-SthenoMaidBlackroot-8B-V1) |
|
* [v000000/L3-8B-Poppy-Sunspice](https://huggingface.co/v000000/L3-8B-Poppy-Sunspice) |
|
* [Nitral-AI/Hathor_RP-v.01-L3-8B](https://huggingface.co/Nitral-AI/Hathor_RP-v.01-L3-8B) |
|
|
|
### Configuration |
|
|
|
The following YAML configuration was used to produce this model: |
|
|
|
```yaml |
|
models: |
|
- model: PJMixers/LLaMa-3-CursedStock-v1.8-8B |
|
parameters: |
|
weight: 0.15 |
|
density: 0.57 |
|
- model: v000000/L3-8B-Poppy-Sunspice |
|
parameters: |
|
weight: 0.20 |
|
density: 0.69 |
|
- model: Nitral-AI/Hathor_RP-v.01-L3-8B |
|
parameters: |
|
weight: 0.30 |
|
density: 0.8 |
|
- model: bluuwhale/L3-SthenoMaidBlackroot-8B-V1 |
|
parameters: |
|
weight: 0.35 |
|
density: 0.85 |
|
merge_method: dare_ties |
|
base_model: tannedbum/L3-Nymeria-8B |
|
parameters: |
|
int8_mask: true |
|
dtype: bfloat16 |
|
``` |
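
To reproduce the merge locally, this config can be passed to mergekit's `mergekit-yaml` entry point. A minimal sketch, assuming the YAML above is saved as `config.yaml` (the output directory name is illustrative):

```bash
# Reproduce the merge with mergekit; the output path is illustrative,
# and --cuda offloads tensor work to the GPU if one is available.
pip install mergekit
mergekit-yaml config.yaml ./L3-8B-UGI-DontPlanToEnd-test --cuda
```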
|
|