pinzhenchen committed
Commit b4b3edc
1 Parent(s): 88e7674

update README

Files changed (1): README.md +25 -11
README.md CHANGED

Previous version:

---
tags:
- translation
license: cc-by-4.0
language:
- en
- zh
---

### Translation model for en-zh_hant HPLT v1.0

This repository contains the model weights for translation models trained with Marian for the HPLT project. For usage instructions, evaluation scripts, and inference scripts, please refer to the [HPLT-MT-Models v1.0](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0) GitHub repository.

* Source language: en
* Target language: zh_hant
* Dataset: HPLT only
* Model: transformer-base
* Tokenizer: SentencePiece (Unigram)
* Cleaning: We use OpusCleaner for cleaning the corpus. Details about the rules used can be found in the filter files on [GitHub](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0/data/zh_hant-en/raw/v0)

To run inference with Marian, refer to the [Inference/Decoding/Translation](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0#inferencedecodingtranslation) section of our GitHub repository.

## Benchmarks

| testset   | BLEU | chr-F | COMET-22 |
| --------- | ---- | ----- | -------- |
| flores200 | 25.4 | 18.9  | 0.8017   |
| ntrex     | 21.3 | 21.6  | 0.7492   |

Updated version:
---
language:
- en
- zh
tags:
- translation
license: cc-by-4.0
---

### HPLT MT release v1.0

This repository contains the translation model for en-zh_hant trained with HPLT data only. For usage instructions, evaluation scripts, and inference scripts, please refer to the [HPLT-MT-Models v1.0](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0) GitHub repository.

### Model Info

* Source language: en
* Target language: zh_hant
* Data: HPLT data only
* Model architecture: Transformer-base
* Tokenizer: SentencePiece (Unigram)
* Cleaning: We used OpusCleaner with a set of basic rules. Details can be found in the filter files on [GitHub](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0/data/en-zh_hant/raw/v0)

You can also read our deliverable report [here](https://hplt-project.org/HPLT_D5_1___Translation_models_for_select_language_pairs.pdf) for more details.

### Usage

The model has been trained with Marian. To run inference, refer to the [Inference/Decoding/Translation](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0#inferencedecodingtranslation) section of our GitHub repository.
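As a minimal sketch of how decoding with Marian could be driven from a script: the checkpoint and vocabulary file names below are assumptions, so substitute the actual files shipped in this repository, and see the linked GitHub section for the authoritative instructions.

```python
# Hedged sketch: invokes the marian-decoder CLI if it is installed.
# The checkpoint and vocabulary names are ASSUMPTIONS -- replace them
# with the actual files in this repository.
import shutil
import subprocess

cmd = [
    "marian-decoder",
    "-m", "model.npz.best-chrf.npz",    # model weights (assumed file name)
    "-v", "model.spm", "model.spm",     # shared SentencePiece vocab (assumed file name)
    "--beam-size", "6",                 # a typical beam width; tune as needed
]

if shutil.which("marian-decoder") is None:
    print("marian-decoder not found on PATH; install Marian first")
else:
    # Marian reads source sentences (one per line) on stdin and writes
    # the translations to stdout.
    proc = subprocess.run(cmd, input="Hello, world!\n",
                          capture_output=True, text=True)
    print(proc.stdout)
```

Marian is line-oriented, so a whole test set can be translated by piping a file into the same command.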

The model can be used with the Hugging Face framework if the weights are converted to the Hugging Face format. We might provide this in the future; contributions are also welcome.
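If you perform such a conversion yourself (for example with the `convert_marian_to_pytorch.py` script that ships with Hugging Face `transformers`), usage could look roughly like the sketch below; the local directory path is a placeholder, since no converted weights are published yet.

```python
# Hedged sketch: loading a HYPOTHETICAL locally converted checkpoint with
# the Hugging Face transformers library. "./en-zh_hant-hf" is a
# placeholder path for the output of your own conversion.
CONVERTED_DIR = "./en-zh_hant-hf"

def translate(sentences, model_dir=CONVERTED_DIR):
    """Translate English sentences to Traditional Chinese."""
    # Imported lazily so the sketch can be read without transformers installed.
    from transformers import MarianMTModel, MarianTokenizer

    tokenizer = MarianTokenizer.from_pretrained(model_dir)
    model = MarianMTModel.from_pretrained(model_dir)
    batch = tokenizer(sentences, return_tensors="pt", padding=True)
    generated = model.generate(**batch)
    return tokenizer.batch_decode(generated, skip_special_tokens=True)

if __name__ == "__main__":
    try:
        print(translate(["The weather is nice today."]))
    except Exception as exc:  # transformers or converted weights missing
        print(f"Conversion not available, skipping: {exc}")
```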

## Benchmarks

| testset   | BLEU | chrF++ | COMET22 |
| --------- | ---- | ------ | ------- |
| flores200 | 25.4 | 18.9   | 0.8017  |
| ntrex     | 21.3 | 21.6   | 0.7492  |

### Acknowledgements

This project has received funding from the European Union's Horizon Europe research and innovation programme under grant agreement No 101070350 and from UK Research and Innovation (UKRI) under the UK government's Horizon Europe funding guarantee [grant number 10052546].

Brought to you by researchers from the University of Edinburgh, Charles University in Prague, and the whole HPLT consortium.