yarongef committed
Commit e42b656
1 Parent(s): 10d995a

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -17,9 +17,9 @@ In addition to cross entropy and cosine teacher-student losses, DistilProtBert w
  DistilProtBert was pretrained on millions of proteins sequences.
 
  Few important differences between DistilProtBert model and the original ProtBert version are:
- 1. The size of the model
- 2. The size of the pretraining dataset
- 3. Time & hardware used for pretraining
+ 1. Size of the model
+ 2. Size of the pretraining dataset
+ 3. Hardware used for pretraining
 
  ## Intended uses & limitations