---
license: mit
language: protein
tags:
  - protein language model
datasets:
  - Uniref50
---

# DistilProtBert model

Distilled protein language model of ProtBert. In addition to the cross-entropy and cosine teacher-student distillation losses, DistilProtBert was pretrained with a masked language modeling (MLM) objective. It works only with capital-letter amino acids.
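As a sketch of how such a checkpoint is typically used, the snippet below loads the model with the Hugging Face `transformers` library and fills a masked position in an uppercase, space-separated amino acid sequence. The repository id `yarongef/DistilProtBert` and the example sequence are assumptions for illustration, not taken from this card.

```python
# Minimal usage sketch (assumes the checkpoint is published as "yarongef/DistilProtBert";
# adjust the repo id if it differs). ProtBert-style models expect uppercase amino acids
# separated by single spaces.
from transformers import BertForMaskedLM, BertTokenizer, pipeline

tokenizer = BertTokenizer.from_pretrained("yarongef/DistilProtBert", do_lower_case=False)
model = BertForMaskedLM.from_pretrained("yarongef/DistilProtBert")

# Fill-mask pipeline: predict the most likely amino acid at the [MASK] position.
unmasker = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(unmasker("D L I P T S S K L V V [MASK] D T S L Q V K K A F F A L V T"))
```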

## Model description