Update README.md
README.md
CHANGED
@@ -38,7 +38,7 @@ DistilProtBert model was pretrained on [Uniref50](https://www.uniprot.org/downlo
 Preprocessing was done using ProtBert's tokenizer.
 The details of the masking procedure for each sequence followed the original Bert (as mentioned in [ProtBert](https://huggingface.co/Rostlab/prot_bert)).
 
-The model was pretrained on a single DGX cluster 3 epochs in total. local batch size was 16, the optimizer used was AdamW with a learning rate of 5e-5 and mixed precision settings.
+The model was pretrained on a single DGX cluster for 3 epochs in total. Local batch size was 16; the optimizer used was AdamW with a learning rate of 5e-5 and mixed-precision settings.
 
 ## Evaluation results
 
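For readers who want to see how the hyperparameters in the new sentence fit together, below is a minimal sketch of the described pretraining recipe using Hugging Face Transformers. It is not the authors' script: the reduced student depth, the toy sequences standing in for Uniref50, and the output path are illustrative assumptions. Only the ProtBert tokenizer, the BERT-style masking, and the stated hyperparameters (3 epochs, local batch size 16, AdamW at 5e-5, mixed precision) come from the README itself.

```python
# A minimal sketch of the pretraining recipe described in the diff.
# Assumptions: the halved student depth, the toy sequences (standing in
# for Uniref50), and the output path are illustrative placeholders.
import torch
from transformers import (
    BertConfig,
    BertForMaskedLM,
    BertTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Preprocessing reuses ProtBert's tokenizer (space-separated residues).
tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)

# Two toy sequences stand in for the tokenized Uniref50 corpus.
train_dataset = [
    tokenizer(seq, truncation=True, max_length=512)
    for seq in ["M K T A Y I A K Q R", "G S H M S L Y D F A V"]
]

# BERT-style masking: 15% of tokens selected, with the original 80/10/10
# mask/random/keep split applied on the fly by the collator.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

# A smaller student initialized from ProtBert's config; halving the depth
# here is an assumption for illustration, not the published architecture.
config = BertConfig.from_pretrained("Rostlab/prot_bert")
config.num_hidden_layers = config.num_hidden_layers // 2
model = BertForMaskedLM(config)

args = TrainingArguments(
    output_dir="distilprotbert-pretrain",  # placeholder path
    num_train_epochs=3,                    # 3 epochs in total
    per_device_train_batch_size=16,        # local batch size of 16
    learning_rate=5e-5,                    # Trainer's default optimizer is AdamW
    fp16=torch.cuda.is_available(),        # mixed precision when a GPU is present
)

Trainer(
    model=model,
    args=args,
    data_collator=collator,
    train_dataset=train_dataset,
).train()
```

No explicit optimizer is constructed because `Trainer` defaults to AdamW, matching the optimizer the README names.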