byt5-base-wikisplit / convert_to_pytorch.py
from transformers import AutoModelForSeq2SeqLM

# Load the Flax checkpoint from the current directory and convert it to PyTorch.
model = AutoModelForSeq2SeqLM.from_pretrained("./", from_flax=True)

# Save the converted PyTorch weights alongside the existing Flax files.
model.save_pretrained("./")
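
# Optional sanity check (a minimal sketch, not part of the original script):
# it assumes the tokenizer files for byt5-base-wikisplit are present in the
# same directory, and the input sentence and generation settings are only
# illustrative.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./")
# Reload without from_flax so the freshly written PyTorch weights are used.
pt_model = AutoModelForSeq2SeqLM.from_pretrained("./")

text = "Mary lived through an era of liberating reform for women and she worked hard."
inputs = tokenizer(text, return_tensors="pt")
outputs = pt_model.generate(**inputs, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))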