Description
This repo contains quantized files of Toppy-Mix-4x7B.
This project was originally a request from BlueNipples: link
The difference from the original Toppy-M is the addition of Noromaid alongside the three models used to make Toppy-M, so that every model acts as an expert in this MoE model rather than all being merged into a single one.
WARNING: ALL THE "K" GGUF QUANTS OF MIXTRAL MODELS SEEM TO BE BROKEN, PREFER Q4_0, Q5_0 OR Q8_0!
Models and LoRAs used
- openchat/openchat_3.5
- NousResearch/Nous-Capybara-7B-V1.9
- HuggingFaceH4/zephyr-7b-beta
- NeverSleep/Noromaid-13B-v0.1.1
- lemonilia/AshhLimaRP-Mistral-7B
- Vulkane/120-Days-of-Sodom-LoRA-Mistral-7b
- Undi95/Mistral-pippa-sharegpt-7b-qlora
Prompt template: Alpaca
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
```
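For reference, below is a minimal sketch of running one of the non-K quants with the llama-cpp-python bindings and the Alpaca template above. The GGUF file name and the generation settings are assumptions, not part of this repo; adjust them to the quant you downloaded.

```python
# Minimal sketch using llama-cpp-python (pip install llama-cpp-python).
# The model_path below is an assumed file name; point it at the quant you
# actually downloaded, preferably Q4_0, Q5_0 or Q8_0 per the warning above.
from llama_cpp import Llama

llm = Llama(
    model_path="Toppy-Mix-4x7B.q5_0.gguf",  # assumed file name
    n_ctx=4096,                             # context window, adjust as needed
)

# Build the prompt exactly as in the Alpaca template above.
instruction = "Write a short haiku about mixtures of experts."
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    f"### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

# Generate, stopping if the model starts a new instruction block.
output = llm(prompt, max_tokens=256, stop=["### Instruction:"])
print(output["choices"][0]["text"])
```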
If you want to support me, you can do so here.