The aim is to compress the mT5-base model so that it supports only Ukrainian plus some basic English.

This reproduces a similar result (for a different language) following this Medium article.

Results:

  • 582M params -> 244M params (a 58% reduction)
  • 250K tokens -> 30K tokens
  • 2.2GB model size -> 0.95GB model size
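The size reduction above comes from shrinking the 250K-token multilingual vocabulary to ~30K tokens and dropping the corresponding embedding rows. A minimal NumPy sketch of that pruning step, using toy matrix sizes and hypothetical kept-token ids (not the article's actual script):

```python
import numpy as np

# Toy sketch of vocabulary pruning: keep only the embedding rows for the
# tokens we want, and build an old-id -> new-id mapping for the reduced
# tokenizer. Sizes and ids below are illustrative, not the real mT5 values.

rng = np.random.default_rng(0)
full_vocab_size, hidden = 250, 8                    # scaled-down toy sizes
embeddings = rng.standard_normal((full_vocab_size, hidden))

# ids of tokens observed in a Ukrainian + basic English corpus (hypothetical)
kept_ids = sorted({0, 1, 2, 5, 17, 42, 99})         # special + corpus tokens

pruned = embeddings[kept_ids]                        # smaller embedding matrix
old_to_new = {old: new for new, old in enumerate(kept_ids)}

print(pruned.shape)      # (7, 8)
print(old_to_new[42])    # 5
```

In the real model the same row selection is applied to the shared input/output embedding of mT5, and the SentencePiece tokenizer is rebuilt over the kept pieces so ids stay consistent.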
Model tree for kravchenko/uk-mt5-base: 4 finetuned models.