🤗 Language model initialized from mT5 and trained for an additional 100K steps on the Prefix LM objective using mC4 data.
Paper: Overcoming Catastrophic Forgetting in Zero-Shot Cross-Lingual Generation
Authors: Tu Vu, Aditya Barua, Brian Lester, Daniel Cer, Mohit Iyyer, Noah Constant
PyTorch port of the original Flax checkpoint from the Google T5X repository.
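A minimal usage sketch with the 🤗 Transformers library is shown below. The model identifier is a placeholder; substitute the actual Hub ID of this checkpoint.

```python
from transformers import AutoTokenizer, MT5ForConditionalGeneration

# Hypothetical model ID -- replace with this repository's actual identifier.
model_id = "google/mt5-base-lm-adapt"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = MT5ForConditionalGeneration.from_pretrained(model_id)

# The Prefix LM objective trains the model to continue a text prefix,
# so we pass a prefix to the encoder and generate its continuation.
inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```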