
T5-Efficient-MINI for grammar correction

This is a T5-Efficient-MINI model trained on a subset of the C4_200M dataset to perform grammar correction in English. To introduce additional errors, random typos were injected into the input sentences using the nlpaug library. Since the model was trained on a single task, no task prefix is needed.
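The training data relied on nlpaug's character-level augmenters for typo injection. As a minimal illustration of the idea (not the actual augmentation code used for this model), a stdlib-only sketch of adjacent-character swaps might look like this; the function name and parameters are hypothetical:

```python
import random


def add_typos(sentence, n_typos=2, seed=None):
    """Swap random pairs of adjacent characters to simulate typos.

    A minimal stand-in for nlpaug-style character augmentation;
    the real training pipeline used the nlpaug library itself.
    """
    rng = random.Random(seed)
    chars = list(sentence)
    for _ in range(n_typos):
        if len(chars) < 2:
            break
        i = rng.randrange(len(chars) - 1)
        # Swap the character at position i with its right neighbor.
        chars[i], chars[i + 1] = chars[i + 1], chars[i]
    return "".join(chars)
```

Because the corruption only reorders characters, a noised sentence keeps the same length and character multiset as the original, which makes the augmentation easy to sanity-check.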

The model was trained as part of a project during the Full Stack Deep Learning course. An ONNX version of the model is deployed on the project site and can be run directly in the browser.
