nicholasKluge committed on
Commit f8157e8 (1 parent: 5362b88)

Update README.md

Files changed (1): README.md +2 -2
README.md CHANGED
@@ -37,9 +37,9 @@ co2_eq_emissions:
 
 ## Model Summary
 
-Given the lack of available monolingual foundational models in non-English languages and the fact that some of the most used and downloaded models by the community are those small enough to allow individual researchers and hobbyists to use them in low-resource environments, we developed the TeenyTinyLlama: _a series of small foundational models trained in Brazilian Portuguese._
+Given the lack of available monolingual foundational models in non-English languages and the fact that some of the most used and downloaded models by the community are those small enough to allow individual researchers and hobbyists to use them in low-resource environments, we developed the TeenyTinyLlama: _a pair of small foundational models trained in Brazilian Portuguese._
 
-TeenyTinyLlama is a series of compact language models based on the Llama 2 architecture. These models are designed to deliver efficient natural language processing capabilities while being resource-conscious.
+TeenyTinyLlama is a pair of compact language models based on the Llama 2 architecture. These models are designed to deliver efficient natural language processing capabilities while being resource-conscious.
 
 Also, TeenyTinyLlama models were trained by leveraging [scaling laws](https://arxiv.org/abs/2203.15556) to determine the optimal number of tokens per parameter while incorporating [preference pre-training](https://arxiv.org/abs/2112.00861).
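
For context on the unchanged scaling-laws sentence above, here is a minimal, hypothetical sketch of the "tokens per parameter" rule of thumb from the cited Chinchilla paper, assuming the commonly quoted ratio of roughly 20 training tokens per parameter; the ratio and the parameter counts used below are illustrative assumptions, not values stated in this diff or in the TeenyTinyLlama README.

```python
# Hypothetical sketch of the Chinchilla-style "tokens per parameter" rule of
# thumb (https://arxiv.org/abs/2203.15556). The ~20 tokens/parameter ratio and
# the parameter counts below are illustrative assumptions, not values taken
# from the TeenyTinyLlama README or from this commit.

TOKENS_PER_PARAM = 20  # commonly cited compute-optimal ratio (assumption)

def token_budget(n_params: int, tokens_per_param: int = TOKENS_PER_PARAM) -> int:
    """Approximate compute-optimal training-token budget for a given model size."""
    return n_params * tokens_per_param

if __name__ == "__main__":
    for n_params in (160_000_000, 460_000_000):  # example small-model sizes
        print(f"{n_params / 1e6:.0f}M params -> ~{token_budget(n_params) / 1e9:.1f}B tokens")
```

Under these assumptions, a 160M-parameter model would target roughly 3.2B training tokens and a 460M-parameter model roughly 9.2B; the actual budgets used for the TeenyTinyLlama pair are documented in the full README rather than in this change.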