---
base_model:
- ResplendentAI/Datura_7B
- ChaoticNeutrals/Cookie_7B
library_name: transformers
tags:
- not-for-all-audiences
license: apache-2.0
datasets:
- ResplendentAI/Luna_NSFW_Text
- unalignment/toxic-dpo-v0.2
- ResplendentAI/Synthetic_Soul_1k
- grimulkan/theory-of-mind
- lemonilia/LimaRP
- PygmalionAI/PIPPA
language:
- en
---
# DaturaCookie
![image/png](https://cdn-uploads.huggingface.co/production/uploads/626dfb8786671a29c715f8a9/5jG2dft51fgPcGUGc-4Ym.png)
Proficient at roleplaying and lighthearted conversation, this model is prone to NSFW outputs.
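For plain text generation, a minimal `transformers` sketch follows. The repo id and sampling settings here are assumptions for illustration; substitute the actual model path:

```python
# Minimal text-generation sketch (repo id assumed; adjust to the actual model path).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ResplendentAI/DaturaCookie_7B"  # assumption: replace with the real repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "Write a short, lighthearted greeting."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```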
# Vision/multimodal capabilities:
If you want to use vision functionality, you must use the latest version of KoboldCpp. To use the multimodal capabilities of this model, load the specified mmproj file, which can be found inside this model repo.
You can load the mmproj in the corresponding section of the interface:
![image/png](https://cdn-uploads.huggingface.co/production/uploads/626dfb8786671a29c715f8a9/UxH8OteeRbD1av1re0yNZ.png)
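Outside the KoboldCpp GUI, the same mmproj file can also be loaded programmatically. A sketch using llama-cpp-python (a different runtime than KoboldCpp; the GGUF and mmproj file names below are assumptions):

```python
# Sketch of multimodal inference via llama-cpp-python; file names are placeholders.
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

# The mmproj file provides the vision projector for the base GGUF model.
chat_handler = Llava15ChatHandler(clip_model_path="mmproj.gguf")
llm = Llama(
    model_path="DaturaCookie_7B.Q4_K_M.gguf",  # assumed quant file name
    chat_handler=chat_handler,
    n_ctx=4096,
)

response = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": "file:///path/to/image.png"}},
                {"type": "text", "text": "Describe this image."},
            ],
        }
    ]
)
print(response["choices"][0]["message"]["content"])
```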
### Models Merged
The following models were included in the merge:
* [ResplendentAI/Datura_7B](https://huggingface.co/ResplendentAI/Datura_7B)
* [ChaoticNeutrals/Cookie_7B](https://huggingface.co/ChaoticNeutrals/Cookie_7B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: ChaoticNeutrals/Cookie_7B
        layer_range: [0, 32]
      - model: ResplendentAI/Datura_7B
        layer_range: [0, 32]
merge_method: slerp
base_model: ResplendentAI/Datura_7B
parameters:
  t:
    - filter: self_attn
      value: [1, 0.75, 0.5, 0.25, 0]
    - filter: mlp
      value: [0, 0.25, 0.5, 0.75, 1]
    - value: 0.5
dtype: bfloat16
```
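For intuition, SLERP interpolates each pair of matched weight tensors along a great arc rather than a straight line, which tends to preserve the magnitude and direction structure of the weights better than plain averaging. A minimal sketch of the per-tensor operation on flattened arrays (illustrative only, not mergekit's actual implementation):

```python
# Minimal SLERP sketch between two flattened weight tensors.
# Illustrative only; mergekit additionally handles layer filtering and edge cases.
import numpy as np

def slerp(w0: np.ndarray, w1: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation: t=0 returns w0, t=1 returns w1."""
    v0 = w0 / (np.linalg.norm(w0) + eps)
    v1 = w1 / (np.linalg.norm(w1) + eps)
    dot = np.clip(np.dot(v0, v1), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two weight directions
    if omega < eps:  # nearly parallel: fall back to linear interpolation
        return (1 - t) * w0 + t * w1
    so = np.sin(omega)
    return (np.sin((1 - t) * omega) / so) * w0 + (np.sin(t * omega) / so) * w1
```

In the config above, the `value` lists define a gradient of `t` across layer blocks: self-attention and MLP tensors blend the two parents in opposite directions through the network, while all remaining tensors are merged at a fixed t = 0.5.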