# DistilProtBert

A distilled version of the [ProtBert-UniRef100](https://huggingface.co/Rostlab/prot_bert) model.

In addition to the cross-entropy and cosine teacher-student distillation losses, DistilProtBert was pretrained with a masked language modeling (MLM) objective, and it works only with capital-letter amino acid sequences.
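Since the model expects capital-letter amino acids, input sequences need a small amount of preprocessing. As a sketch (following the ProtBert tokenizer convention of uppercasing, mapping rare or ambiguous residues U, Z, O, B to X, and space-separating residues; the helper name below is our own):

```python
import re

def format_sequence(seq: str) -> str:
    """Prepare a protein sequence for DistilProtBert:
    uppercase letters, rare/ambiguous residues (U, Z, O, B) mapped to X,
    and amino acids separated by spaces (ProtBert tokenizer convention)."""
    seq = seq.upper()
    seq = re.sub(r"[UZOB]", "X", seq)
    return " ".join(seq)

print(format_sequence("mktayiakUr"))  # -> "M K T A Y I A K X R"
```

The resulting string can then be fed to a standard `fill-mask` pipeline or tokenizer from the 🤗 Transformers library.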
Check out our paper [DistilProtBert: A distilled protein language model used to distinguish between real proteins and their randomly shuffled counterparts](https://www.biorxiv.org/content/10.1101/2022.05.09.491157v1) for more details.