This is a variant of the google/mt5-base model in which only the Ukrainian tokens and 9% of the English tokens remain in the vocabulary. The resulting model has 252M parameters, 43% of the original size. Special thanks to cointegrated for the practical example and inspiration.
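The size reduction comes almost entirely from shrinking the embedding and LM-head matrices, which dominate mT5's parameter count. Below is a back-of-envelope sketch of that arithmetic; the full-model figure (~582M) and the trimmed vocabulary size (~35k tokens) are assumptions chosen for illustration, not numbers stated in this card.

```python
# Rough estimate of the size reduction from vocabulary trimming.
# All constants below are approximate/assumed, not taken from this model card.
D_MODEL = 768            # mT5-base hidden size
FULL_VOCAB = 250_112     # mT5 SentencePiece vocabulary size
TRIMMED_VOCAB = 35_000   # assumed size of the Ukrainian + partial-English subset

# mT5 keeps separate (untied) input-embedding and LM-head matrices,
# so each vocabulary entry costs 2 * d_model parameters.
def embedding_params(vocab_size):
    return 2 * vocab_size * D_MODEL

FULL_TOTAL = 582_000_000  # approximate google/mt5-base parameter count
non_embedding = FULL_TOTAL - embedding_params(FULL_VOCAB)
trimmed_total = non_embedding + embedding_params(TRIMMED_VOCAB)

print(f"non-embedding params: {non_embedding / 1e6:.0f}M")
print(f"trimmed total:        {trimmed_total / 1e6:.0f}M")
print(f"fraction of original: {trimmed_total / FULL_TOTAL:.0%}")
```

Under these assumptions the estimate lands at roughly 252M parameters, about 43% of the original, matching the figures above.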

Citing & Authors

```bibtex
@misc{Uaritm,
      title={SetFit: Classification of medical texts},
      author={Vitaliy Ostashko},
      year={2022},
      url={https://esemi.org}
}
```