dotw committed on
Commit
e3491e3
1 Parent(s): 9b98e50

specify base model in readme

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -5,7 +5,7 @@ license: mit
 
 SEA-LION is a collection of Large Language Models (LLMs) which has been pretrained and instruct-tuned for the Southeast Asia (SEA) region.
 The size of the models range from 3 billion to 7 billion parameters.
-This is the card for the SEA-LION 7B model.
+This is the card for the SEA-LION 7B base model.
 
 SEA-LION stands for <i>Southeast Asian Languages In One Network</i>.