---
license: apache-2.0
language:
- en
- is
library_name: fairseq
tags:
- translation
- wmt
---
## Model description
This model translates text from English to Icelandic. It follows the Transformer architecture described in [Attention Is All You Need](https://arxiv.org/pdf/1706.03762) and was trained with [fairseq](https://github.com/facebookresearch/fairseq) for [WMT24](https://www2.statmt.org/wmt24/).
This is the base version of our model. See also: [wmt24-en-is-transformer-base-deep](https://huggingface.co/arnastofnun/wmt24-en-is-transformer-base-deep), [wmt24-en-is-transformer-big](https://huggingface.co/arnastofnun/wmt24-en-is-transformer-big), [wmt24-en-is-transformer-big-deep](https://huggingface.co/arnastofnun/wmt24-en-is-transformer-big-deep).
| model     | d_model | d_ff | h  | N_enc | N_dec |
|:----------|:--------|:-----|:---|:------|:------|
| Base      | 512     | 2048 | 8  | 6     | 6     |
| Base_deep | 512     | 2048 | 8  | 36    | 12    |
| Big       | 1024    | 4096 | 16 | 6     | 6     |
| Big_deep  | 1024    | 4096 | 16 | 36    | 12    |
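The table above corresponds to fairseq's standard Transformer hyperparameter flags. As a hedged sketch (not the actual training recipe: the data path is a placeholder, and optimizer, dropout, and other non-architecture options are omitted), the Base row would map onto a `fairseq-train` invocation like:

```shell
# Illustrative fairseq-train call matching the Base row above.
# path/to/binarized-data is a placeholder; real runs also need
# optimizer, learning-rate, and regularization options.
fairseq-train path/to/binarized-data \
    --source-lang en --target-lang is \
    --arch transformer \
    --encoder-embed-dim 512 --decoder-embed-dim 512 \
    --encoder-ffn-embed-dim 2048 --decoder-ffn-embed-dim 2048 \
    --encoder-attention-heads 8 --decoder-attention-heads 8 \
    --encoder-layers 6 --decoder-layers 6
```

For the deep and big variants, only the embedding dimensions, head count, and layer counts change, per the table.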
#### How to use
```python
from fairseq.models.transformer import TransformerModel

TRANSLATION_MODEL_NAME = 'checkpoint_best.pt'
# Load the checkpoint together with its SentencePiece BPE model
TRANSLATION_MODEL = TransformerModel.from_pretrained(
    'path/to/model',
    checkpoint_file=TRANSLATION_MODEL_NAME,
    bpe='sentencepiece',
    sentencepiece_model='sentencepiece.bpe.model',
)

src_sentences = ['This is a test sentence.', 'This is another test sentence.']
# translate() applies BPE, runs beam search, and decodes the output
translated_sentences = TRANSLATION_MODEL.translate(src_sentences, beam=5)
print(translated_sentences)
```
#### Limitations and bias
## Training data
## Eval results
### BibTeX entry and citation info
```bibtex
@inproceedings{...,
year={XXX},
title={XXX},
author={XXX},
booktitle={XXX},
}
```