Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

ft-solar-10.7b-v2.1-dpo - GGUF

- Model creator: https://huggingface.co/ifuseok/
- Original model: https://huggingface.co/ifuseok/ft-solar-10.7b-v2.1-dpo/

| Name | Quant method | Size |
| ---- | ---- | ---- |
| [ft-solar-10.7b-v2.1-dpo.Q2_K.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q2_K.gguf) | Q2_K | 3.73GB |
| [ft-solar-10.7b-v2.1-dpo.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.IQ3_XS.gguf) | IQ3_XS | 4.14GB |
| [ft-solar-10.7b-v2.1-dpo.IQ3_S.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.IQ3_S.gguf) | IQ3_S | 4.37GB |
| [ft-solar-10.7b-v2.1-dpo.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q3_K_S.gguf) | Q3_K_S | 4.34GB |
| [ft-solar-10.7b-v2.1-dpo.IQ3_M.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.IQ3_M.gguf) | IQ3_M | 4.51GB |
| [ft-solar-10.7b-v2.1-dpo.Q3_K.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q3_K.gguf) | Q3_K | 4.84GB |
| [ft-solar-10.7b-v2.1-dpo.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q3_K_M.gguf) | Q3_K_M | 4.84GB |
| [ft-solar-10.7b-v2.1-dpo.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q3_K_L.gguf) | Q3_K_L | 5.26GB |
| [ft-solar-10.7b-v2.1-dpo.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.IQ4_XS.gguf) | IQ4_XS | 5.43GB |
| [ft-solar-10.7b-v2.1-dpo.Q4_0.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q4_0.gguf) | Q4_0 | 5.66GB |
| [ft-solar-10.7b-v2.1-dpo.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.IQ4_NL.gguf) | IQ4_NL | 5.72GB |
| [ft-solar-10.7b-v2.1-dpo.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q4_K_S.gguf) | Q4_K_S | 5.7GB |
| [ft-solar-10.7b-v2.1-dpo.Q4_K.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q4_K.gguf) | Q4_K | 6.02GB |
| [ft-solar-10.7b-v2.1-dpo.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q4_K_M.gguf) | Q4_K_M | 6.02GB |
| [ft-solar-10.7b-v2.1-dpo.Q4_1.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q4_1.gguf) | Q4_1 | 6.27GB |
| [ft-solar-10.7b-v2.1-dpo.Q5_0.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q5_0.gguf) | Q5_0 | 6.89GB |
| [ft-solar-10.7b-v2.1-dpo.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q5_K_S.gguf) | Q5_K_S | 6.89GB |
| [ft-solar-10.7b-v2.1-dpo.Q5_K.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q5_K.gguf) | Q5_K | 7.08GB |
| [ft-solar-10.7b-v2.1-dpo.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q5_K_M.gguf) | Q5_K_M | 7.08GB |
| [ft-solar-10.7b-v2.1-dpo.Q5_1.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q5_1.gguf) | Q5_1 | 7.51GB |
| [ft-solar-10.7b-v2.1-dpo.Q6_K.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q6_K.gguf) | Q6_K | 8.2GB |
| [ft-solar-10.7b-v2.1-dpo.Q8_0.gguf](https://huggingface.co/RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf/blob/main/ft-solar-10.7b-v2.1-dpo.Q8_0.gguf) | Q8_0 | 10.62GB |

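Any file in the table can also be fetched programmatically. The snippet below is a minimal sketch that builds the direct-download URL for one quant (Q4_K_M is an arbitrary example; substitute any filename from the table). Note that the table links point at `blob/main/`, which serves an HTML page; `resolve/main/` serves the raw file.

```python
repo_id = "RichardErkhov/ifuseok_-_ft-solar-10.7b-v2.1-dpo-gguf"
filename = "ft-solar-10.7b-v2.1-dpo.Q4_K_M.gguf"

# "resolve" serves the raw GGUF bytes; "blob" returns an HTML page instead.
url = f"https://huggingface.co/{repo_id}/resolve/main/{filename}"
print(url)
```

Tools such as `huggingface-cli download` or `hf_hub_download` from the `huggingface_hub` package accept the same `repo_id`/`filename` pair and handle caching and resumed downloads for you.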
Original model description:

---
language:
- ko
pipeline_tag: text-generation
datasets:
- nlpai-lab/databricks-dolly-15k-ko
- kyujinpy/KOR-OpenOrca-Platypus-v3
- KETI-AIR/kor_boolq
- heegyu/open-korean-instructions
license: cc-by-nc-sa-4.0
---

**Input** Models input text only.

**Output** Models generate text only.

**Base Model** [yanolja/KoSOLAR-10.7B-v0.1](https://huggingface.co/yanolja/KoSOLAR-10.7B-v0.1-deprecated)

**Training Dataset**

- [nlpai-lab/databricks-dolly-15k-ko](https://huggingface.co/datasets/nlpai-lab/databricks-dolly-15k-ko)
- [kyujinpy/KOR-OpenOrca-Platypus-v3](https://huggingface.co/datasets/kyujinpy/KOR-OpenOrca-Platypus-v3)
- [heegyu/open-korean-instructions](https://huggingface.co/datasets/heegyu/open-korean-instructions)
- [KETI-AIR/kor_boolq](https://huggingface.co/datasets/KETI-AIR/kor_boolq)
- [Part of an AIHub translation dataset](https://aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&dataSetSn=71593)

# Implementation Code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

repo = "ifuseok/ft-solar-10.7b-v2.1-dpo"

# Load the model in half precision and let device_map="auto" place it
# across available GPUs/CPU (requires the accelerate package).
model = AutoModelForCausalLM.from_pretrained(
    repo,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(repo)
```

# Prompt Example
```
### System:
This is the system message.

### User:
This is the user message.

### Assistant
This is the assistant response.
```
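The template above can be assembled with a small helper. `build_prompt` below is a hypothetical convenience function, not part of the original model card; it reproduces the section headers exactly as shown and stops at the assistant header so the model generates the reply:

```python
def build_prompt(system: str, user: str) -> str:
    # Reproduce the "### System / ### User / ### Assistant" layout shown
    # above, ending at the assistant header for the model to complete.
    return (
        f"### System:\n{system}\n\n"
        f"### User:\n{user}\n\n"
        "### Assistant\n"
    )

prompt = build_prompt("You are a helpful assistant.", "Hello!")
print(prompt)
```

The completed prompt is then tokenized and passed to `model.generate` as usual.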