Add library_name and pipeline_tag metadata
This PR improves the model card by adding the `library_name` and `pipeline_tag` metadata. Setting `library_name: transformers` lets users easily find and use the model, and `pipeline_tag: text-generation` ensures the model can be found at https://huggingface.co/models?pipeline_tag=text-generation&sort=trending.
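As a sanity check, the keys this PR adds can be verified programmatically against the card's YAML front matter. A minimal sketch (the front matter below is abridged from the diff, and the hand-rolled parser only handles the flat `key: value` lines it needs; a real check would use a YAML library):

```python
# Abridged front matter from the updated README.md in this PR.
card = """---
base_model:
- Qwen/Qwen2.5-0.5B
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
---
"""

# Extract the text between the two `---` delimiters.
front = card.split("---")[1]

# Collect top-level `key: value` pairs, skipping list items (`- ...`).
meta = {}
for line in front.strip().splitlines():
    if ":" in line and not line.startswith("- "):
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip()

# The two discovery keys this PR adds must be present and well-formed.
assert meta["library_name"] == "transformers"
assert meta["pipeline_tag"] == "text-generation"
```

With these keys in place, the Hub can index the model under the text-generation filter and associate it with the transformers library.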
README.md CHANGED

@@ -1,4 +1,6 @@
 ---
+base_model:
+- Qwen/Qwen2.5-0.5B
 language:
 - en
 - zh
@@ -12,6 +14,8 @@ language:
 - km
 - su
 - tl
+license: apache-2.0
+library_name: transformers
 tags:
 - multilingual
 - sea
@@ -29,18 +33,16 @@ widget:
   example_title: Indonesian
 - text: Làm thế nào để nướng cá?
   example_title: Vietnamese
-
-base_model:
-- Qwen/Qwen2.5-0.5B
+pipeline_tag: text-generation
 ---
 
+```markdown
 <div align="center">
 <img src="sailor2_banner.jpg" width="700"/>
 </div>
 
 > The logo was generated by MidJourney
 
-
 Sailor2 is a community-driven initiative that brings cutting-edge multilingual language models to South-East Asia (SEA).
 Our research highlights a strong demand for models in the **8B and 20B parameter** range for production use, alongside **1B models** for specialized applications,
 such as speculative decoding and research purposes.
@@ -58,7 +60,6 @@ The Sailor2 model comes in three sizes, 1B, 8B, and 20B, which are **expanded fr
 - **Codebase:** [github.com/sail-sg/sailor2](https://github.com/sail-sg/sailor2)
 - **Technical Report:** [Sailor2 Report](https://arxiv.org/pdf/2502.12982)
 
-
 ## Training details
 
 During development, we employ a range of advanced technologies to ensure top-tier performance and efficiency:
@@ -70,7 +71,6 @@ During development, we employ a range of advanced technologies to ensure top-tie
 
 Please refer to [Sailor2 Blog](https://sea-sailor.github.io/blog/sailor2/) for more training details.
 
-
 ## Requirements
 The code of Sailor2 has been in the latest Hugging face transformers and we advise you to install `transformers==4.46.3`.
 
@@ -145,4 +145,5 @@ If you find Sailor2 useful, please cite our work as follows:
 
 # Contact Us
 
-If you have any questions, please raise an issue or contact us at [doulx@sea.com](mailto:doulx@sea.com) or [liuqian.sea@gmail.com](mailto:liuqian.sea@gmail.com).
+If you have any questions, please raise an issue or contact us at [doulx@sea.com](mailto:doulx@sea.com) or [liuqian.sea@gmail.com](mailto:liuqian.sea@gmail.com).
+```