jarodrigues committed
Commit ebe2468
1 Parent(s): b1dd777

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -73,7 +73,7 @@ Please use the above cannonical reference when using or citing this model.

  # Model Description

- **This model card is for Gervásio 7B PTBR**, with 7 billion parameters, a hidden size of 4096 units, an intermediate size of 11,008 units, 32 attention heads, 32 hidden layers, and a tokenizer obtained using the Byte-Pair Encoding (BPE) algorithm implemented with SentencePiece, featuring a vocabulary size of 32,000.
+ **This model card is for Gervásio 7B PTBR**, with 7 billion parameters, a hidden size of 4,096 units, an intermediate size of 11,008 units, 32 attention heads, 32 hidden layers, and a tokenizer obtained using the Byte-Pair Encoding (BPE) algorithm implemented with SentencePiece, featuring a vocabulary size of 32,000.

  Gervásio-7B-PTBR-Decoder is distributed under an [MIT license](https://huggingface.co/PORTULAN/gervasio-7b-portuguese-ptpt-decoder/blob/main/LICENSE).
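
The paragraph touched by this commit lists the decoder's configuration (hidden size 4,096, intermediate size 11,008, 32 attention heads, 32 hidden layers, and a SentencePiece BPE tokenizer with a 32,000-token vocabulary). As a quick sanity check, a minimal sketch like the one below loads the configuration and tokenizer with Hugging Face `transformers` and prints those values. The repository id is an assumption based on the model's name; it is not taken from this diff (the license link points at the PT-PT sibling repository).

```python
# Minimal sketch for checking the figures quoted in the model card.
# NOTE: the repository id is an assumption, not stated in this commit;
# adjust it to the actual Gervásio 7B PTBR decoder repository if it differs.
from transformers import AutoConfig, AutoTokenizer

repo_id = "PORTULAN/gervasio-7b-portuguese-ptbr-decoder"  # assumed repository id

config = AutoConfig.from_pretrained(repo_id)
tokenizer = AutoTokenizer.from_pretrained(repo_id)

print(config.hidden_size)          # card reports 4,096
print(config.intermediate_size)    # card reports 11,008
print(config.num_attention_heads)  # card reports 32
print(config.num_hidden_layers)    # card reports 32
print(tokenizer.vocab_size)        # card reports 32,000
```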