Update checkpoint for transformers>=4.29
#3
by ArthurZ (HF staff)
Following the merge of a PR in transformers, it appeared that this model was not properly converted. This PR fixes inference and was tested using the following script:
>>> from transformers import AutoTokenizer, MarianMTModel
>>> tokenizer = AutoTokenizer.from_pretrained('Helsinki-NLP/opus-mt-tc-base-uk-ro')
>>> inputs = tokenizer("Гей!Давайте вчитися разом", return_tensors="pt", padding=True)
>>> model = MarianMTModel.from_pretrained('Helsinki-NLP/opus-mt-tc-base-uk-ro')
>>> print(tokenizer.batch_decode(model.generate(**inputs)))
['<pad> A.A.A.A.A.A.A.A.A.A.A.A.A.A.A.A.A.A.</s>']
Automatically merging the PR.
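As a side note, a minimal sketch of how one could try the updated checkpoint before the merge lands on the main branch: `from_pretrained` accepts a `revision` argument that can point at a Hub pull request ref. The `refs/pr/3` ref below is an assumption based on this being PR #3.
>>> from transformers import AutoTokenizer, MarianMTModel
>>> # revision='refs/pr/3' targets this pull request's ref on the Hub (assumed from the PR number above)
>>> tokenizer = AutoTokenizer.from_pretrained('Helsinki-NLP/opus-mt-tc-base-uk-ro')
>>> model = MarianMTModel.from_pretrained('Helsinki-NLP/opus-mt-tc-base-uk-ro', revision='refs/pr/3')
>>> inputs = tokenizer("Гей!Давайте вчитися разом", return_tensors="pt", padding=True)
>>> print(tokenizer.batch_decode(model.generate(**inputs), skip_special_tokens=True))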
ArthurZ changed pull request status to merged