This is a model checkpoint for "Should You Mask 15% in Masked Language Modeling?" (code). The model uses pre-layer normalization, which is not supported by HuggingFace Transformers. To use the model, clone our GitHub repository, and import the RoBERTa classes from `huggingface/modeling_roberta_prelayernorm.py`. For example:

```python
from huggingface.modeling_roberta_prelayernorm import RobertaForMaskedLM, RobertaForSequenceClassification
```