---
license: other
license_name: yi-license
license_link: https://huggingface.co/01-ai/Yi-34B-200K/blob/main/LICENSE
---
This is a 4.65bpw quantized version of [DrNicefellow/ChatAllInOne-Yi-34B-200K-V1](https://huggingface.co/DrNicefellow/ChatAllInOne-Yi-34B-200K-V1) made with [exllamav2](https://github.com/turboderp/exllamav2).
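Below is a minimal loading and inference sketch using exllamav2's Python API, based on the example scripts in the exllamav2 repository. The local model directory path, sampler settings, and reduced context length are assumptions for illustration; the exact API may vary between exllamav2 versions.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Point at the local directory containing the downloaded exl2 weights (hypothetical path).
config = ExLlamaV2Config()
config.model_dir = "/path/to/ChatAllInOne-Yi-34B-200K-V1-4.65bpw-exl2"
config.prepare()

# The base model supports 200K context; cap it lower here to keep the cache within typical VRAM.
config.max_seq_len = 8192

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)          # split the model across available GPUs automatically
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

prompt = "USER: What is quantization in LLMs? ASSISTANT:"
output = generator.generate_simple(prompt, settings, 256)
print(output)
```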
## Model Details
- **Base Model**: [01-ai/Yi-34B-200K](https://huggingface.co/01-ai/Yi-34B-200K)
- **Fine-tuning Technique**: QLoRA (Quantized Low-Rank Adaptation)
- **Dataset**: [CHAT-ALL-IN-ONE-v1](https://huggingface.co/datasets/DrNicefellow/CHAT-ALL-IN-ONE-v1)
- **Tool Used for Fine-tuning**: [unsloth](https://github.com/unslothai/unsloth)
## Features
- Enhanced understanding and generation of conversational language.
- Improved performance in diverse chat scenarios, including casual, formal, and domain-specific conversations.
- Fine-tuned to maintain context and coherence over longer dialogues.
## Prompt Format
This model uses the Vicuna 1.1 prompt format. See the fine-tuning dataset for examples; a sketch of the template follows.
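A typical Vicuna 1.1-style prompt looks like the following. The system preamble shown here is the commonly used default, not necessarily the exact one in the dataset, so check the dataset before relying on it:

```
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. USER: Summarize the plot of Hamlet in two sentences. ASSISTANT:
```

For multi-turn conversations, append each exchange in the same pattern, ending with `ASSISTANT:` so the model continues from there.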
## License
This model is open-sourced under the [Yi License](https://huggingface.co/01-ai/Yi-34B-200K/blob/main/LICENSE).
## Feeling Generous? 😊
Eager to buy me a $2 cup of coffee or iced tea? 🍵☕ Sure, here is the link: [https://ko-fi.com/drnicefellow](https://ko-fi.com/drnicefellow). Please add a note about which one you'd like me to drink.