Update README.md
README.md
@@ -1,6 +1,6 @@
 ---
 library_name: transformers
-license: cc-by-nc-
+license: cc-by-nc-4.0
 language:
 - ko
 pipeline_tag: text-generation
@@ -18,23 +18,12 @@ pipeline_tag: text-generation
 
 <!-- Provide a longer summary of what this model is. -->
 
-
+POLAR is a Korean LLM developed by Plateer's AI-lab. It was inspired by Upstage's SOLAR and is under the same license (cc-by-nc-4.0). We will continue to evolve this model and hope to contribute to the Korean LLM ecosystem.
 
-- **Developed by:** [More Information Needed]
-- **Funded by [optional]:** [More Information Needed]
-- **Shared by [optional]:** [More Information Needed]
-- **Model type:** [More Information Needed]
-- **Language(s) (NLP):** [More Information Needed]
-- **License:** [More Information Needed]
-- **Finetuned from model [optional]:** [More Information Needed]
-
-### Model Sources [optional]
-
-<!-- Provide the basic links for the model. -->
-
-- **Repository:** [More Information Needed]
-- **Paper [optional]:** [More Information Needed]
-- **Demo [optional]:** [More Information Needed]
+- **Developed by:** Plateer's AI-lab
+- **Model type:** Transformer
+- **Language(s) (NLP):** Korean
+- **License:** cc-by-nc-4.0
 
 ## Uses
 
@@ -44,35 +33,35 @@ This is the model card of a 🤗 transformers model that has been pushed on the
 
 <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
 
-
+
 
 ### Downstream Use [optional]
 
 <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
 
-
+
 
 ### Out-of-Scope Use
 
 <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
 
-
+
 
 ## Bias, Risks, and Limitations
 
 <!-- This section is meant to convey both technical and sociotechnical limitations. -->
 
-
+
 
 ### Recommendations
 
 <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
 
-
+
 
 ## How to Get Started with the Model
 
-
+
 
 [More Information Needed]
 
@@ -191,7 +180,8 @@ Carbon emissions can be estimated using the [Machine Learning Impact calculator]
 
 ## More Information [optional]
 
-
+If you would like more information about our company, please visit the link below.
+[tech.x2bee.com](tech.x2bee.com)
 
 ## Model Card Authors [optional]
 