JuhaoLiang committed on
Commit c5971db • 1 Parent(s): 0fa524b

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -15,7 +15,7 @@ Arabic language domain. This is the repository for the version 1.5 of 13B-chat p
  ## Model Details
  We have released the AceGPT family of large language models, a collection of fully fine-tuned generative text models based on LlaMA2, ranging from 7B to 13B parameters. Our models include two main categories: AceGPT and AceGPT-chat. AceGPT-chat is an optimized version specifically designed for dialogue applications. Our models have demonstrated superior performance compared to all currently available open-source Arabic dialogue models in multiple benchmark tests. Furthermore, in our human evaluations, our models have shown satisfaction levels comparable to some closed-source models, such as ChatGPT, in the Arabic language.
  ## Model Developers
- We are from the Chinese University of Hong Kong, Shenzhen (CUHKSZ), the Shenzhen Research Institute of Big Data (SRIBD), the King Abdullah University of Science and Technology (KAUST), and King AbdulAziz University (KAU).
+ We are from the King Abdullah University of Science and Technology (KAUST), the Chinese University of Hong Kong, Shenzhen (CUHKSZ), the Shenzhen Research Institute of Big Data (SRIBD), and King AbdulAziz University (KAU).
  ## Variations
  The AceGPT family comes in two parameter sizes, 7B and 13B; each size is available as a base model and a -chat model.
  ## Paper
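For reference, a minimal sketch of loading a chat variant of the model card described in this diff with the Hugging Face `transformers` library. The repository id `FreedomIntelligence/AceGPT-13B-chat` and the example prompt are illustrative assumptions, not taken from this commit.

```python
# Minimal sketch: load an AceGPT chat checkpoint and generate a reply.
# The repo id below is an assumption for illustration; substitute the
# actual id of the repository you intend to use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FreedomIntelligence/AceGPT-13B-chat"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps the 13B weights smaller in memory
    device_map="auto",          # automatic device placement; requires the `accelerate` package
)

prompt = "ما هي عاصمة المملكة العربية السعودية؟"  # "What is the capital of Saudi Arabia?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Half precision with automatic device placement is one common way to fit a 13B-parameter model on limited GPU memory; adjust the dtype and placement to your hardware.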