Update README.md
README.md
CHANGED
@@ -11,6 +11,7 @@ datasets:
 
 Distilled version of the [ProtBert-UniRef100](https://huggingface.co/Rostlab/prot_bert) model.
 In addition to the cross-entropy and cosine teacher-student losses, DistilProtBert was pretrained with a masked language modeling (MLM) objective, and it works only with capital-letter amino acids.
+
 [Git](https://github.com/yarongef/DistilProtBert) repository.
 
 # Model details
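
The note about capital-letter amino acids refers to the input convention ProtBert-family models use: sequences are given as upper-case, space-separated residue letters. A minimal sketch of that formatting step (the helper name `format_sequence` is hypothetical, not part of the repository):

```python
def format_sequence(seq: str) -> str:
    """Upper-case a raw amino acid sequence and insert spaces
    between residues, the input format ProtBert-style models expect."""
    return " ".join(seq.upper())

# Example: a raw lower-case sequence becomes model-ready input.
print(format_sequence("mktayiak"))  # -> "M K T A Y I A K"
```

The formatted string can then be passed to the tokenizer or a `fill-mask` pipeline loaded from the model checkpoint.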