AudreyVM committed
Commit 8c327f4 · verified · 1 parent: c5123e7

Update README.md

Files changed (1):
  1. README.md +3 -1
README.md CHANGED
```diff
@@ -725,13 +725,15 @@ NLLB-3.3 ([Costa-jussà et al., 2022](https://arxiv.org/abs/2207.04672)) and [Sa
 
 </details>
 
-<details>
+
 ### Gender Aware Translation
 
 Below are the evaluation results for gender aware translation evaluated on the [MT-GenEval](https://github.com/amazon-science/machine-translation-gender-eval?tab=readme-ov-file#mt-geneval) dataset ([Currey, A. et al.](https://github.com/amazon-science/machine-translation-gender-eval?tab=readme-ov-file#mt-geneval)).
 These have been calculated for translation from English into German, Spanish, French, Italian, Portuguese and Russian and are compared against MADLAD400-7B, TowerInstruct-7B-v0.2 and the SalamandraTA-7b-base model.
 Evaluation was conducted using MT-Lens and is reported as accuracy computed using the accuracy metric provided with MT-GenEval.
 
+<details>
+
 | | Source | Target | Masc | Fem | Pair |
 |:---------------------------------|:---------|:---------|-------:|-------:|-------:|
 | SalamandraTA-7b-instruct | en | de | **0.8833333333333333** | **0.8833333333333333** | **0.7733333333333333** |
```
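For context on the three columns in the table above, the reported numbers can be thought of as per-variant accuracies plus a joint one. A minimal sketch of that aggregation, assuming per-example correctness flags and assuming that "Pair" counts an example only when both the masculine and the feminine variant are translated correctly (this is an illustrative reading, not the MT-GenEval implementation itself):

```python
# Hypothetical aggregation of MT-GenEval-style gender accuracy.
# Assumption: each test example has a masculine and a feminine reference
# variant, and "Pair" requires both variants to be correct.

def gender_accuracy(masc_correct, fem_correct):
    """Return (masc, fem, pair) accuracies from per-example booleans."""
    assert len(masc_correct) == len(fem_correct) and masc_correct
    n = len(masc_correct)
    masc = sum(masc_correct) / n
    fem = sum(fem_correct) / n
    # An example counts toward "pair" only if both variants are correct.
    pair = sum(m and f for m, f in zip(masc_correct, fem_correct)) / n
    return masc, fem, pair
```

Under this reading, the pair score can never exceed the lower of the two per-variant scores, which matches the pattern in the row above (0.77 vs. 0.88).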