T5-small distilled from FLAN-T5-base on OpenOrca's FLAN dataset using the Seq2seq distillation methodology. It serves as the baseline for Seq2seq distillation of T5 encoder-decoder models. It is released under the Apache License.
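The Seq2seq distillation objective described above can be sketched as word-level knowledge distillation: at each decoder position the student is trained to match the teacher's (temperature-softened) distribution over the vocabulary. This is a minimal NumPy sketch of that loss, not the training code used for this model; the function name, temperature, and toy shapes are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the vocabulary axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def seq2seq_kd_loss(teacher_logits, student_logits, temperature=2.0):
    """Word-level KD: KL(teacher || student) over the vocabulary,
    averaged across decoder positions. Shapes: (positions, vocab)."""
    t = softmax(teacher_logits / temperature)
    s = softmax(student_logits / temperature)
    kl = np.sum(t * (np.log(t + 1e-12) - np.log(s + 1e-12)), axis=-1)
    # Scale by T^2, as is conventional, so gradients keep their magnitude.
    return float(kl.mean()) * temperature ** 2

rng = np.random.default_rng(0)
teacher = rng.normal(size=(4, 10))  # 4 decoder positions, toy vocab of 10
student = rng.normal(size=(4, 10))
print(seq2seq_kd_loss(teacher, student))   # positive when distributions differ
print(seq2seq_kd_loss(teacher, teacher))   # ~0 when student matches teacher
```

In practice the teacher (FLAN-T5-base) and student (T5-small) logits come from a forward pass over the same FLAN batch, and this KD term is typically mixed with the ordinary cross-entropy loss on the gold targets.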

The distilled model uses the plain T5-small architecture (60M parameters), not the FLAN-T5-small architecture.
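Since the student is a standard T5-small checkpoint, it can be loaded with the usual `transformers` Seq2seq classes. A minimal inference sketch, assuming the repo id `Sayan01/T5-Flan-Small` from this card and an illustrative prompt:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Repo id taken from this model card; downloads weights on first use.
tok = AutoTokenizer.from_pretrained("Sayan01/T5-Flan-Small")
model = AutoModelForSeq2SeqLM.from_pretrained("Sayan01/T5-Flan-Small")

prompt = "Summarize: The quick brown fox jumps over the lazy dog."
ids = tok(prompt, return_tensors="pt").input_ids
out = model.generate(ids, max_new_tokens=32)
print(tok.decode(out[0], skip_special_tokens=True))
```

Generation settings (beam search, sampling, length limits) follow the standard `generate` API and are not specific to this checkpoint.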


Dataset used to train Sayan01/T5-Flan-Small: OpenOrca's FLAN dataset.