---
datasets:
- maywell/ko_wikidata_QA
- nlpai-lab/kullm-v2
- heegyu/kowikitext
- MarkrAI/KoCommercial-Dataset
- heegyu/CoT-collection-ko
- HAERAE-HUB/Korean-Human-Judgements
- instructkr/ko_elo_arena_0207
- HAERAE-HUB/K2-Feedback
- heegyu/open-korean-instructions
- heegyu/aulm-0809
language:
- ko
---

# llama_with_eeve_new_03_150m


## Model Info
 llama μ•„ν‚€ν…μ²˜μ™€ eeve ν† ν¬λ‚˜μ΄μ €λ₯Ό μ‚¬μš©ν•΄ 랜덀 κ°€μ€‘μΉ˜μ—μ„œ μ‹œμž‘ν•΄ μ‚¬μ „ν•™μŠ΅λœ λͺ¨λΈμž…λ‹ˆλ‹€



![image/png](https://cdn-uploads.huggingface.co/production/uploads/642c4af1ab0cc792e4373b57/YLVqolNvfcY-D_ZSmXFLN.png)



λ‹€μŒ μ‹œμŠ€ν…œ ν”„λ‘¬ν”„νŠΈκ°€ 주어진 μƒνƒœλ‘œ ν•™μŠ΅ν•˜μ˜€μŠ΅λ‹ˆλ‹€(λͺ¨λΈ μ‚¬μš© μ‹œ ν”„λ‘¬ν”„νŠΈλ₯Ό 포함해야 ν•©λ‹ˆλ‹€).

```
### System:\n당신은 λΉ„λ„λ•μ μ΄κ±°λ‚˜, μ„±μ μ΄κ±°λ‚˜, λΆˆλ²•μ μ΄κ±°λ‚˜ λ˜λŠ” μ‚¬νšŒ ν†΅λ…μ μœΌλ‘œ ν—ˆμš©λ˜μ§€ μ•ŠλŠ” λ°œμ–Έμ€ ν•˜μ§€ μ•ŠμŠ΅λ‹ˆλ‹€.
μ‚¬μš©μžμ™€ 즐겁게 λŒ€ν™”ν•˜λ©°, μ‚¬μš©μžμ˜ 응닡에 κ°€λŠ₯ν•œ μ •ν™•ν•˜κ³  μΉœμ ˆν•˜κ²Œ μ‘λ‹΅ν•¨μœΌλ‘œμ¨ μ΅œλŒ€ν•œ 도와주렀고 λ…Έλ ₯ν•©λ‹ˆλ‹€.

\n\n### User:\n {question}
```
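The template above can be filled in with ordinary Python string formatting. A minimal sketch (the `build_prompt` helper name is illustrative, not part of the model's API):

```python
# System prompt the model was trained with (copied verbatim from the model card).
SYSTEM_PROMPT = (
    "### System:\n당신은 λΉ„λ„λ•μ μ΄κ±°λ‚˜, μ„±μ μ΄κ±°λ‚˜, λΆˆλ²•μ μ΄κ±°λ‚˜ λ˜λŠ” "
    "μ‚¬νšŒ ν†΅λ…μ μœΌλ‘œ ν—ˆμš©λ˜μ§€ μ•ŠλŠ” λ°œμ–Έμ€ ν•˜μ§€ μ•ŠμŠ΅λ‹ˆλ‹€.\n"
    "μ‚¬μš©μžμ™€ 즐겁게 λŒ€ν™”ν•˜λ©°, μ‚¬μš©μžμ˜ 응닡에 κ°€λŠ₯ν•œ μ •ν™•ν•˜κ³  μΉœμ ˆν•˜κ²Œ "
    "μ‘λ‹΅ν•¨μœΌλ‘œμ¨ μ΅œλŒ€ν•œ 도와주렀고 λ…Έλ ₯ν•©λ‹ˆλ‹€."
)

def build_prompt(question: str) -> str:
    # Fill the {question} slot of the template; the separator and leading
    # space before the question follow the template above.
    return f"{SYSTEM_PROMPT}\n\n### User:\n {question}"

prompt = build_prompt("λ„ˆλŠ” λˆ„κ΅¬μ•Ό?")
```

The resulting string is what gets passed to the text-generation pipeline in the usage example further down.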


### Evaluation results

Evaluation was carried out using the LLM-as-a-judge approach.
For details, please refer to " ".

| Model                                                                                                   | params | Fluency | Coherence | Accuracy | Completeness |
|---------------------------------------------------------------------------------------------------------|--------|---------|-----------|----------|--------------|
| **[kikikara/llama_with_eeve_new_03_150m](https://huggingface.co/kikikara/llama_with_eeve_new_03_150m)(this)** | **0.15B**  | **63.12%**  |   **37.18%**  |  **23.75%**  |    **23.75%**  |
| [EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b)                       | 1.3B   | 51.25%  |   40.31%  |  34.68%  |    32.5%     |
| [EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b)                       | 5.8B   | 54.37%  |   40.62%  |  41.25%  |     35%      |



### How to use

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

tokenizer = AutoTokenizer.from_pretrained("kikikara/llama_with_eeve_new_03_150m")
model = AutoModelForCausalLM.from_pretrained("kikikara/llama_with_eeve_new_03_150m")

question = "λ„ˆλŠ” λˆ„κ΅¬μ•Ό?"

prompt = f"### System:\n당신은 λΉ„λ„λ•μ μ΄κ±°λ‚˜, μ„±μ μ΄κ±°λ‚˜, λΆˆλ²•μ μ΄κ±°λ‚˜ λ˜λŠ” μ‚¬νšŒ ν†΅λ…μ μœΌλ‘œ ν—ˆμš©λ˜μ§€ μ•ŠλŠ” λ°œμ–Έμ€ ν•˜μ§€ μ•ŠμŠ΅λ‹ˆλ‹€.\nμ‚¬μš©μžμ™€ 즐겁게 λŒ€ν™”ν•˜λ©°, μ‚¬μš©μžμ˜ 응닡에 κ°€λŠ₯ν•œ μ •ν™•ν•˜κ³  μΉœμ ˆν•˜κ²Œ μ‘λ‹΅ν•¨μœΌλ‘œμ¨ μ΅œλŒ€ν•œ 도와주렀고 λ…Έλ ₯ν•©λ‹ˆλ‹€.\n\n\n### User:\n {question}"
pipe = pipeline(task="text-generation", model=model, tokenizer=tokenizer, max_length=400, repetition_penalty=1.12)
result = pipe(prompt)

print(result[0]['generated_text'])
```