Update README.md
README.md CHANGED

@@ -1,6 +1,5 @@
 ---
 license: mit
-language: protein
 tags:
 - protein language model
 datasets:
@@ -12,7 +11,7 @@ datasets:
 A distilled version of [ProtBert-UniRef100](https://huggingface.co/Rostlab/prot_bert) model.
 In addition to cross entropy and cosine teacher-student losses, DistilProtBert was pretrained on a masked language modeling (MLM) objective and it only works with capital letter amino acids.
 
-Check out our paper [DistilProtBert: A distilled protein language model used to distinguish between real proteins and their randomly shuffled counterparts](https://
+Check out our paper [DistilProtBert: A distilled protein language model used to distinguish between real proteins and their randomly shuffled counterparts](https://academic.oup.com/bioinformatics/article-abstract/38/Supplement_2/ii95/6701995?redirectedFrom=fulltext) for more details.
 
 [Git](https://github.com/yarongef/DistilProtBert) repository.
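The model card above notes that DistilProtBert only works with capital-letter amino acids. A minimal sketch of preparing a raw sequence for a ProtBert-family tokenizer, which expects uppercase, space-separated residues with rare amino acids (U, Z, O, B) mapped to X per the upstream ProtBert card; the `format_sequence` helper name itself is hypothetical, not part of the model's API:

```python
import re

def format_sequence(seq: str) -> str:
    """Uppercase a raw amino-acid sequence, map rare residues (U, Z, O, B)
    to X, and space-separate residues as ProtBert-family tokenizers expect."""
    seq = seq.upper()
    seq = re.sub(r"[UZOB]", "X", seq)
    return " ".join(seq)

print(format_sequence("mktayiak"))  # M K T A Y I A K
```

The formatted string can then be passed to the model's tokenizer (e.g. via `transformers`), since the WordPiece vocabulary only contains uppercase single-residue tokens.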