Update README.md
README.md
CHANGED
@@ -9,7 +9,7 @@ datasets:

# DistilProtBert

-Distilled version of [ProtBert](https://huggingface.co/Rostlab/prot_bert) model.
+Distilled version of the [ProtBert-UniRef100](https://huggingface.co/Rostlab/prot_bert) model.
In addition to cross entropy and cosine teacher-student losses, DistilProtBert was pretrained on a masked language modeling (MLM) objective, and it only works with capital-letter amino acids.

# Model description
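The hunk above notes that DistilProtBert was pretrained with an MLM objective and only handles capital-letter amino acids. A minimal sketch of what MLM inference could look like for a ProtBert-style checkpoint follows; the checkpoint id `yarongef/DistilProtBert` and the space-separated uppercase input format are assumptions, not taken from this diff.

```python
# Minimal fill-mask sketch (assumed checkpoint id and input format).
from transformers import pipeline

unmasker = pipeline("fill-mask", model="yarongef/DistilProtBert")

# Uppercase, space-separated single-letter amino acids, one residue masked.
sequence = "M K T A Y I A K Q R [MASK] S T E L"
for prediction in unmasker(sequence, top_k=3):
    print(prediction["token_str"], round(prediction["score"], 3))
```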
@@ -17,9 +17,9 @@ In addition to cross entropy and cosine teacher-student losses, DistilProtBert w

DistilProtBert was pretrained on millions of protein sequences.

A few important differences between the DistilProtBert model and the original ProtBert version are:
-1. Size of the model
-2. Size of the pretraining dataset
-3. Hardware used for pretraining
+1. Size of the model: 230M parameters (ProtBert has 420M parameters)
+2. Size of the pretraining dataset: ~43M proteins (ProtBert was pretrained on 216M proteins)
+3. Hardware used for pretraining: five V100 32GB Nvidia GPUs (ProtBert was pretrained on 512 16GB TPUs)

## Intended uses & limitations

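To sanity-check the 230M-parameter figure added in the hunk above, a quick sketch (again assuming the `yarongef/DistilProtBert` checkpoint id) would be:

```python
# Rough parameter count for the distilled checkpoint (assumed checkpoint id).
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("yarongef/DistilProtBert")
total = sum(p.numel() for p in model.parameters())
print(f"{total / 1e6:.0f}M parameters")  # expected to be roughly 230M
```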
@@ -51,6 +51,5 @@ When fine-tuned on downstream tasks, this model achieves the following results:

| CB513 | 79 | |
| DeepLoc | | 86 |

-Distinguish between:

### BibTeX entry and citation info