ruSciBERT

The model was trained by the Sber AI team and the MLSA Lab of the Institute for AI, MSU. If you use our model in your project, please tell us about it (nikgerasimenko@gmail.com).

Presented at AI Journey 2022.

  • Task: mask filling
  • Type: encoder
  • Tokenizer: BPE
  • Dict size: 50,265
  • Num Parameters: 123 M
  • Training Data Volume: 6.5 GB
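Since the model is an encoder trained for mask filling, it can be queried through the Hugging Face `fill-mask` pipeline. Below is a minimal sketch; the model ID `ai-forever/ruSciBERT` and the example sentence are assumptions, so substitute the actual repository ID if it differs.

```python
# Minimal mask-filling sketch for ruSciBERT via the transformers pipeline.
# NOTE: the model ID "ai-forever/ruSciBERT" is an assumption; replace it
# with the actual Hugging Face repository ID if it differs.
from transformers import pipeline

fill = pipeline("fill-mask", model="ai-forever/ruSciBERT")

# Use the tokenizer's own mask token rather than hard-coding "<mask>".
text = f"Статья посвящена {fill.tokenizer.mask_token} обучению нейронных сетей."

# Each result carries the filled token and its probability score.
for result in fill(text)[:3]:
    print(result["token_str"], round(result["score"], 3))
```

The pipeline downloads the weights on first use; for repeated offline runs, load the model and tokenizer once with `AutoModelForMaskedLM.from_pretrained` and cache them locally.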