specify base model in readme
README.md CHANGED
@@ -5,7 +5,7 @@ license: mit
 
 SEA-LION is a collection of Large Language Models (LLMs) which has been pretrained and instruct-tuned for the Southeast Asia (SEA) region.
 The size of the models range from 3 billion to 7 billion parameters.
-This is the card for the SEA-LION 7B model.
+This is the card for the SEA-LION 7B base model.
 
 SEA-LION stands for <i>Southeast Asian Languages In One Network</i>.
 