Description

Adaptation of the flan-t5-small weights to make them compatible with the FAT5 framework (Flash Attention T5).
This adaptation lets users efficiently continue pre-training flan-t5, for example to adapt it to more recent data or to specialize it in a specific domain.
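
As a rough illustration of what continued pre-training could look like, the sketch below uses a plain PyTorch loop. It assumes that the repository's custom code also exposes a sequence-to-sequence LM head through AutoModelForSeq2SeqLM and that the model accepts the standard T5 arguments (input_ids, attention_mask, labels); both points should be checked against the FAT5 documentation, and the toy data is purely illustrative.

# Hedged sketch: assumes the custom FAT5 code can be loaded via
# AutoModelForSeq2SeqLM and follows the standard T5 forward signature.
import torch
from torch.utils.data import DataLoader
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model = AutoModelForSeq2SeqLM.from_pretrained("CATIE-AQ/FAT5-small-flan-en", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("CATIE-AQ/FAT5-small-flan-en")

# Toy corpus standing in for domain-specific or more recent data.
pairs = [("summarize: FAT5 adapts T5 to Flash Attention.", "FAT5 = T5 + Flash Attention.")]

def collate(batch):
    inputs = tokenizer([src for src, _ in batch], return_tensors="pt", padding=True)
    labels = tokenizer([tgt for _, tgt in batch], return_tensors="pt", padding=True).input_ids
    labels[labels == tokenizer.pad_token_id] = -100  # ignore padding in the loss
    inputs["labels"] = labels
    return inputs

loader = DataLoader(pairs, batch_size=1, collate_fn=collate)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

model.train()
for batch in loader:
    loss = model(**batch).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()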

Usage

from transformers import AutoModel, AutoTokenizer

# trust_remote_code=True is required because FAT5 relies on custom modeling code
model = AutoModel.from_pretrained("CATIE-AQ/FAT5-small-flan-en", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("CATIE-AQ/FAT5-small-flan-en")
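
Once loaded, the model can be queried like a standard T5 backbone. The snippet below is a minimal sketch that assumes the custom FAT5 module keeps the usual T5Model forward signature (input_ids, attention_mask, decoder_input_ids) and that, like T5, decoding starts from the pad token; verify this against the FAT5 code before relying on it.

import torch

text = "FAT5 brings Flash Attention to T5."
inputs = tokenizer(text, return_tensors="pt")

# Assumption: FAT5, like T5, uses the pad token as the decoder start token.
decoder_input_ids = torch.tensor([[tokenizer.pad_token_id]])

with torch.no_grad():
    outputs = model(
        input_ids=inputs.input_ids,
        attention_mask=inputs.attention_mask,
        decoder_input_ids=decoder_input_ids,
    )

print(outputs.last_hidden_state.shape)  # (batch, decoder_length, hidden_size)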