pinzhenchen committed on
Commit
cb87330
1 Parent(s): 893342d

update README

Files changed (1)
  1. README.md +9 -9
README.md CHANGED
@@ -1,24 +1,24 @@
 ---
 language:
 - en
-- sq
+- zh
 tags:
 - translation
 license: cc-by-4.0
 ---

-## HPLT MT release v1.0
+### HPLT MT release v1.0

-This repository contains the translation model for en-sq trained with OPUS and HPLT data. For usage instructions, evaluation scripts, and inference scripts, please refer to the [HPLT-MT-Models v1.0](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0) GitHub repository.
+This repository contains the translation model for en-zh_hant trained with HPLT data only. For usage instructions, evaluation scripts, and inference scripts, please refer to the [HPLT-MT-Models v1.0](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0) GitHub repository.

 ### Model Info

 * Source language: English
-* Target language: Albanian
-* Dataset: OPUS and HPLT data
+* Target language: Traditional Chinese
+* Data: HPLT data only
 * Model architecture: Transformer-base
 * Tokenizer: SentencePiece (Unigram)
-* Cleaning: We used OpusCleaner with a set of basic rules. Details can be found in the filter files in [Github](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0/data/en-sq/raw/v2)
+* Cleaning: We used OpusCleaner with a set of basic rules. Details can be found in the filter files in [Github](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0/data/en-zh_hant/raw/v0)

 You can also read our deliverable report [here](https://hplt-project.org/HPLT_D5_1___Translation_models_for_select_language_pairs.pdf) for more details.

@@ -29,12 +29,12 @@ The model has been trained with Marian. To run inference, refer to the [Inferenc

 The model can be used with the Hugging Face framework if the weights are converted to the Hugging Face format. We might provide this in the future; contributions are also welcome.

-### Benchmarks
+## Benchmarks

 | testset | BLEU | chrF++ | COMET22 |
 | -------------------------------------- | ---- | ----- | ----- |
-| flores200 | 30.7 | 56.6 | 0.8761 |
-| ntrex | 32.7 | 56.1 | 0.8517 |
+| flores200 | 25.4 | 18.9 | 0.8017 |
+| ntrex | 21.3 | 21.6 | 0.7492 |

 ### Acknowledgements
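
The updated Benchmarks table reports BLEU, chrF++, and COMET22 scores. As a rough illustration of what the BLEU column measures, here is a minimal sentence-level BLEU sketch in plain Python. This is a teaching aid only, not the evaluation code behind the table; official HPLT scores come from standard evaluation tooling, and corpus-level BLEU differs in detail from this simplified version.

```python
import math
from collections import Counter


def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))


def bleu(hypothesis, reference, max_n=4):
    """Simplified sentence-level BLEU (0-100): geometric mean of 1..max_n
    n-gram precisions, times a brevity penalty for short hypotheses."""
    hyp, ref = hypothesis.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        hyp_ngrams, ref_ngrams = ngrams(hyp, n), ngrams(ref, n)
        overlap = sum((hyp_ngrams & ref_ngrams).values())  # clipped matches
        total = max(sum(hyp_ngrams.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)  # smooth to avoid log(0)
    # Brevity penalty: penalize hypotheses shorter than the reference.
    bp = 1.0 if len(hyp) > len(ref) else math.exp(1 - len(ref) / max(len(hyp), 1))
    return 100 * bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

A perfect match scores 100, while a hypothesis sharing no n-grams with the reference scores near 0; the scores in the table sit in between because translations partially overlap with the references.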