---
language: ko
license: mit
tags:
- bart
- grammar
---

# kogrammar-tiny-distil

Dataset: 국립국어원 맞춤법 교정 말뭉치 (National Institute of Korean Language spelling-correction corpus)
<br>
<br>
**Backbone Models**:
- [kobart-base-v2](https://huggingface.co/gogamza/kobart-base-v2/blob/main/README.md)
- [kogrammar-base](https://huggingface.co/theSOL1/kogrammar-base)

**GitHub Repo**:
- [SOL1archive/KoGrammar](https://github.com/SOL1archive/KoGrammar)
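
A minimal usage sketch (not part of the original card): since this is a KoBART-style seq2seq checkpoint, it should load with the standard `transformers` classes. The repo id `theSOL1/kogrammar-tiny-distil` and the example sentence are assumptions.

```python
from transformers import AutoTokenizer, BartForConditionalGeneration

model_id = "theSOL1/kogrammar-tiny-distil"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = BartForConditionalGeneration.from_pretrained(model_id)

# Example input with spacing errors (hypothetical; any Korean sentence works).
text = "아버지가방에들어가신다"
inputs = tokenizer(text, return_tensors="pt")

# Beam search tends to work well for short correction outputs.
output_ids = model.generate(**inputs, num_beams=4, max_length=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```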

## Training Method
About 67.5% of the full dataset was used for training.
<br>
Using SFT distillation, the decoder of [kogrammar-base](https://huggingface.co/theSOL1/kogrammar-base) was shrunk from 6 layers to 1 and the model was retrained.
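
A rough sketch of how such a truncation could be set up before the SFT pass (this is not the authors' code; seeding the lone decoder layer from the teacher's first layer is an assumption):

```python
from copy import deepcopy
from transformers import BartForConditionalGeneration

teacher = BartForConditionalGeneration.from_pretrained("theSOL1/kogrammar-base")

# Student config: identical to the teacher's, except for a single decoder layer.
config = deepcopy(teacher.config)
config.decoder_layers = 1
student = BartForConditionalGeneration(config)

# strict=False transfers every matching weight (encoder, embeddings, lm_head)
# and seeds decoder layer 0 from the teacher's first decoder layer; the
# teacher's remaining decoder layers are simply dropped.
student.load_state_dict(teacher.state_dict(), strict=False)

# The student is then fine-tuned (SFT) on the correction pairs as usual.
```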

## Metrics
|BLEU-2|ROUGE-2 F1|
|-|-|
|77.8%|55.0%|
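
The card does not specify the evaluation code; below is a hedged sketch of how BLEU-2 and ROUGE-2 F1 are commonly computed, using `nltk` and `rouge-score` with whitespace tokenization (both choices are assumptions, and the sentences are hypothetical):

```python
from nltk.translate.bleu_score import sentence_bleu
from rouge_score import rouge_scorer

reference = "아버지가 방에 들어가신다"   # gold correction (hypothetical)
prediction = "아버지가 방에 들어가신다"  # model output (hypothetical)

# BLEU-2: cumulative BLEU over unigrams and bigrams (equal weights).
bleu2 = sentence_bleu([reference.split()], prediction.split(), weights=(0.5, 0.5))

# ROUGE-2 F1: bigram-overlap F-measure. Note: rouge-score's default
# tokenizer targets English; Korean evaluation may need a custom tokenizer.
scorer = rouge_scorer.RougeScorer(["rouge2"])
rouge2 = scorer.score(reference, prediction)["rouge2"].fmeasure

print(f"BLEU-2: {bleu2:.3f}  ROUGE-2 F1: {rouge2:.3f}")
```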