
# NaturalRoBERTa

This is a pre-trained RoBERTa-type language model for Russian. NaturalRoBERTa was trained on data from open sources: three news sub-corpora of the Taiga corpus (Lenta.ru, Interfax, N+1) and Russian Wikipedia texts.
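A minimal usage sketch, loading the model for masked-token prediction with the `transformers` library. The mask token is assumed to be the RoBERTa default (`<mask>`); the example sentence is illustrative only.

```python
from transformers import pipeline

# Load the model from the Hub into a fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="tay-yozhik/NaturalRoBERTa")

# Predict the masked token ("The capital of Russia is <mask>.").
for prediction in fill_mask("Столица России — <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```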

## Evaluation

The model was evaluated on the RussianSuperGLUE benchmark tasks:

| Task    | Result        | Metric                           |
|---------|---------------|----------------------------------|
| LiDiRus | 0.0           | Matthews Correlation Coefficient |
| RCB     | 0.217 / 0.484 | F1 / Accuracy                    |
| PARus   | 0.498         | Accuracy                         |
| TERRa   | 0.487         | Accuracy                         |
| RUSSE   | 0.587         | Accuracy                         |
| RWSD    | 0.669         | Accuracy                         |
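Scores like these come from fine-tuning the pre-trained model on each task. A sketch of that setup for one task (TERRa, a textual-entailment task), assuming the `RussianNLP/russian_super_glue` dataset on the Hub and its `premise`/`hypothesis`/`label` field names; this is not the card author's exact training recipe.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "tay-yozhik/NaturalRoBERTa"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# Two labels for TERRa: entailment / not entailment.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Assumed dataset ID and config name; verify before running.
dataset = load_dataset("RussianNLP/russian_super_glue", "terra")

def preprocess(batch):
    # Encode premise/hypothesis pairs for sequence classification.
    return tokenizer(batch["premise"], batch["hypothesis"],
                     truncation=True, max_length=256)

encoded = dataset.map(preprocess, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="terra-finetune",
                           per_device_train_batch_size=16,
                           num_train_epochs=3),
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
print(trainer.evaluate())
```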
