lbourdois committed
Commit
ab24490
1 Parent(s): e06bf4c

Add multilingual to the language tag


Hi! This PR adds `multilingual` to the language tag to improve referencing.
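For reference, the front matter after this change would look roughly like this (a sketch assembled from the diff below, showing only the affected fields, with `multilingual` appended to the language list and the license key kept alongside):

```yaml
language:
- lb
- nds
- nl
- multilingual
license: cc-by-4.0
tags:
- translation
- opus-mt-tc
```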

Files changed (1)
  1. README.md +486 -1055
README.md CHANGED
@@ -8,1215 +8,646 @@ language:
8
  - lb
9
  - nds
10
  - nl
11
-
 
12
  tags:
13
  - translation
14
  - opus-mt-tc
15
-
16
- license: cc-by-4.0
17
  model-index:
18
  - name: opus-mt-tc-big-gmw-gmw
19
  results:
20
  - task:
21
- name: Translation afr-deu
22
  type: translation
23
- args: afr-deu
24
  dataset:
25
  name: flores101-devtest
26
  type: flores_101
27
  args: afr deu devtest
28
  metrics:
29
- - name: BLEU
30
- type: bleu
31
- value: 30.2
32
- - name: chr-F
33
- type: chrf
34
- value: 0.58718
35
- - task:
36
- name: Translation afr-eng
37
- type: translation
38
- args: afr-eng
39
- dataset:
40
- name: flores101-devtest
41
- type: flores_101
42
- args: afr eng devtest
43
- metrics:
44
- - name: BLEU
45
- type: bleu
46
- value: 55.1
47
- - name: chr-F
48
- type: chrf
49
- value: 0.74826
50
- - task:
51
- name: Translation afr-ltz
52
- type: translation
53
- args: afr-ltz
54
- dataset:
55
- name: flores101-devtest
56
- type: flores_101
57
- args: afr ltz devtest
58
- metrics:
59
- - name: BLEU
60
- type: bleu
61
- value: 15.7
62
- - name: chr-F
63
- type: chrf
64
- value: 0.46826
65
- - task:
66
- name: Translation afr-nld
67
- type: translation
68
- args: afr-nld
69
- dataset:
70
- name: flores101-devtest
71
- type: flores_101
72
- args: afr nld devtest
73
- metrics:
74
- - name: BLEU
75
- type: bleu
76
- value: 22.5
77
- - name: chr-F
78
- type: chrf
79
- value: 0.54441
80
- - task:
81
- name: Translation deu-afr
82
- type: translation
83
- args: deu-afr
84
- dataset:
85
- name: flores101-devtest
86
- type: flores_101
87
- args: deu afr devtest
88
- metrics:
89
- - name: BLEU
90
- type: bleu
91
- value: 26.4
92
- - name: chr-F
93
- type: chrf
94
- value: 0.57835
95
- - task:
96
- name: Translation deu-eng
97
- type: translation
98
- args: deu-eng
99
- dataset:
100
- name: flores101-devtest
101
- type: flores_101
102
- args: deu eng devtest
103
- metrics:
104
- - name: BLEU
105
- type: bleu
106
- value: 41.8
107
- - name: chr-F
108
- type: chrf
109
- value: 0.66990
110
- - task:
111
- name: Translation deu-ltz
112
- type: translation
113
- args: deu-ltz
114
- dataset:
115
- name: flores101-devtest
116
- type: flores_101
117
- args: deu ltz devtest
118
- metrics:
119
- - name: BLEU
120
- type: bleu
121
- value: 20.3
122
- - name: chr-F
123
- type: chrf
124
- value: 0.52554
125
- - task:
126
- name: Translation deu-nld
127
- type: translation
128
- args: deu-nld
129
- dataset:
130
- name: flores101-devtest
131
- type: flores_101
132
- args: deu nld devtest
133
- metrics:
134
- - name: BLEU
135
- type: bleu
136
- value: 24.2
137
- - name: chr-F
138
- type: chrf
139
- value: 0.55710
140
- - task:
141
- name: Translation eng-afr
142
- type: translation
143
- args: eng-afr
144
- dataset:
145
- name: flores101-devtest
146
- type: flores_101
147
- args: eng afr devtest
148
- metrics:
149
- - name: BLEU
150
- type: bleu
151
- value: 40.7
152
- - name: chr-F
153
- type: chrf
154
- value: 0.68429
155
- - task:
156
- name: Translation eng-deu
157
- type: translation
158
- args: eng-deu
159
- dataset:
160
- name: flores101-devtest
161
- type: flores_101
162
- args: eng deu devtest
163
- metrics:
164
- - name: BLEU
165
- type: bleu
166
- value: 38.5
167
- - name: chr-F
168
- type: chrf
169
- value: 0.64888
170
- - task:
171
- name: Translation eng-ltz
172
- type: translation
173
- args: eng-ltz
174
- dataset:
175
- name: flores101-devtest
176
- type: flores_101
177
- args: eng ltz devtest
178
- metrics:
179
- - name: BLEU
180
- type: bleu
181
- value: 18.4
182
- - name: chr-F
183
- type: chrf
184
- value: 0.49231
185
- - task:
186
- name: Translation eng-nld
187
- type: translation
188
- args: eng-nld
189
- dataset:
190
- name: flores101-devtest
191
- type: flores_101
192
- args: eng nld devtest
193
- metrics:
194
- - name: BLEU
195
- type: bleu
196
- value: 26.8
197
- - name: chr-F
198
- type: chrf
199
- value: 0.57984
200
- - task:
201
- name: Translation ltz-afr
202
- type: translation
203
- args: ltz-afr
204
- dataset:
205
- name: flores101-devtest
206
- type: flores_101
207
- args: ltz afr devtest
208
- metrics:
209
- - name: BLEU
210
- type: bleu
211
- value: 23.2
212
- - name: chr-F
213
- type: chrf
214
- value: 0.53623
215
- - task:
216
- name: Translation ltz-deu
217
- type: translation
218
- args: ltz-deu
219
- dataset:
220
- name: flores101-devtest
221
- type: flores_101
222
- args: ltz deu devtest
223
- metrics:
224
- - name: BLEU
225
- type: bleu
226
- value: 30.0
227
- - name: chr-F
228
- type: chrf
229
- value: 0.59122
230
- - task:
231
- name: Translation ltz-eng
232
- type: translation
233
- args: ltz-eng
234
- dataset:
235
- name: flores101-devtest
236
- type: flores_101
237
- args: ltz eng devtest
238
- metrics:
239
- - name: BLEU
240
- type: bleu
241
- value: 31.0
242
- - name: chr-F
243
- type: chrf
244
- value: 0.57557
245
- - task:
246
- name: Translation ltz-nld
247
- type: translation
248
- args: ltz-nld
249
- dataset:
250
- name: flores101-devtest
251
- type: flores_101
252
- args: ltz nld devtest
253
- metrics:
254
- - name: BLEU
255
- type: bleu
256
- value: 18.6
257
- - name: chr-F
258
- type: chrf
259
- value: 0.49312
260
  - task:
261
- name: Translation nld-afr
262
  type: translation
263
- args: nld-afr
264
- dataset:
265
- name: flores101-devtest
266
- type: flores_101
267
- args: nld afr devtest
268
- metrics:
269
- - name: BLEU
270
- type: bleu
271
- value: 20.0
272
- - name: chr-F
273
- type: chrf
274
- value: 0.52409
275
- - task:
276
- name: Translation nld-deu
277
- type: translation
278
- args: nld-deu
279
- dataset:
280
- name: flores101-devtest
281
- type: flores_101
282
- args: nld deu devtest
283
- metrics:
284
- - name: BLEU
285
- type: bleu
286
- value: 22.6
287
- - name: chr-F
288
- type: chrf
289
- value: 0.53898
290
- - task:
291
- name: Translation nld-eng
292
- type: translation
293
- args: nld-eng
294
- dataset:
295
- name: flores101-devtest
296
- type: flores_101
297
- args: nld eng devtest
298
- metrics:
299
- - name: BLEU
300
- type: bleu
301
- value: 30.7
302
- - name: chr-F
303
- type: chrf
304
- value: 0.58970
305
- - task:
306
- name: Translation nld-ltz
307
- type: translation
308
- args: nld-ltz
309
- dataset:
310
- name: flores101-devtest
311
- type: flores_101
312
- args: nld ltz devtest
313
- metrics:
314
- - name: BLEU
315
- type: bleu
316
- value: 11.8
317
- - name: chr-F
318
- type: chrf
319
- value: 0.42637
320
- - task:
321
  name: Translation deu-eng
322
- type: translation
323
- args: deu-eng
324
  dataset:
325
  name: multi30k_test_2016_flickr
326
  type: multi30k-2016_flickr
327
  args: deu-eng
328
  metrics:
329
- - name: BLEU
330
- type: bleu
331
- value: 39.9
332
- - name: chr-F
333
- type: chrf
334
- value: 0.60928
335
  - task:
336
- name: Translation eng-deu
337
  type: translation
338
- args: eng-deu
339
- dataset:
340
- name: multi30k_test_2016_flickr
341
- type: multi30k-2016_flickr
342
- args: eng-deu
343
- metrics:
344
- - name: BLEU
345
- type: bleu
346
- value: 35.4
347
- - name: chr-F
348
- type: chrf
349
- value: 0.64172
350
- - task:
351
  name: Translation deu-eng
352
- type: translation
353
- args: deu-eng
354
  dataset:
355
  name: multi30k_test_2017_flickr
356
  type: multi30k-2017_flickr
357
  args: deu-eng
358
  metrics:
359
- - name: BLEU
360
- type: bleu
361
- value: 40.5
362
- - name: chr-F
363
- type: chrf
364
- value: 0.63154
365
  - task:
366
- name: Translation eng-deu
367
  type: translation
368
- args: eng-deu
369
- dataset:
370
- name: multi30k_test_2017_flickr
371
- type: multi30k-2017_flickr
372
- args: eng-deu
373
- metrics:
374
- - name: BLEU
375
- type: bleu
376
- value: 34.2
377
- - name: chr-F
378
- type: chrf
379
- value: 0.63078
380
- - task:
381
  name: Translation deu-eng
382
- type: translation
383
- args: deu-eng
384
  dataset:
385
  name: multi30k_test_2017_mscoco
386
  type: multi30k-2017_mscoco
387
  args: deu-eng
388
  metrics:
389
- - name: BLEU
390
- type: bleu
391
- value: 32.2
392
- - name: chr-F
393
- type: chrf
394
- value: 0.55708
395
  - task:
396
- name: Translation eng-deu
397
  type: translation
398
- args: eng-deu
399
- dataset:
400
- name: multi30k_test_2017_mscoco
401
- type: multi30k-2017_mscoco
402
- args: eng-deu
403
- metrics:
404
- - name: BLEU
405
- type: bleu
406
- value: 29.1
407
- - name: chr-F
408
- type: chrf
409
- value: 0.57537
410
- - task:
411
  name: Translation deu-eng
412
- type: translation
413
- args: deu-eng
414
  dataset:
415
  name: multi30k_test_2018_flickr
416
  type: multi30k-2018_flickr
417
  args: deu-eng
418
  metrics:
419
- - name: BLEU
420
- type: bleu
421
- value: 36.9
422
- - name: chr-F
423
- type: chrf
424
- value: 0.59422
425
  - task:
426
- name: Translation eng-deu
427
  type: translation
428
- args: eng-deu
429
- dataset:
430
- name: multi30k_test_2018_flickr
431
- type: multi30k-2018_flickr
432
- args: eng-deu
433
- metrics:
434
- - name: BLEU
435
- type: bleu
436
- value: 30.0
437
- - name: chr-F
438
- type: chrf
439
- value: 0.59597
440
- - task:
441
  name: Translation deu-eng
442
- type: translation
443
- args: deu-eng
444
  dataset:
445
  name: news-test2008
446
  type: news-test2008
447
  args: deu-eng
448
  metrics:
449
- - name: BLEU
450
- type: bleu
451
- value: 27.2
452
- - name: chr-F
453
- type: chrf
454
- value: 0.54601
455
  - task:
456
- name: Translation eng-deu
457
  type: translation
458
- args: eng-deu
459
- dataset:
460
- name: news-test2008
461
- type: news-test2008
462
- args: eng-deu
463
- metrics:
464
- - name: BLEU
465
- type: bleu
466
- value: 23.6
467
- - name: chr-F
468
- type: chrf
469
- value: 0.53149
470
- - task:
471
  name: Translation afr-deu
472
- type: translation
473
- args: afr-deu
474
  dataset:
475
  name: tatoeba-test-v2021-08-07
476
  type: tatoeba_mt
477
  args: afr-deu
478
  metrics:
479
- - name: BLEU
480
- type: bleu
481
- value: 50.4
482
- - name: chr-F
483
- type: chrf
484
- value: 0.68679
485
  - task:
486
- name: Translation afr-eng
487
  type: translation
488
- args: afr-eng
489
- dataset:
490
- name: tatoeba-test-v2021-08-07
491
- type: tatoeba_mt
492
- args: afr-eng
493
- metrics:
494
- - name: BLEU
495
- type: bleu
496
- value: 56.6
497
- - name: chr-F
498
- type: chrf
499
- value: 0.70682
500
- - task:
501
- name: Translation afr-nld
502
- type: translation
503
- args: afr-nld
504
- dataset:
505
- name: tatoeba-test-v2021-08-07
506
- type: tatoeba_mt
507
- args: afr-nld
508
- metrics:
509
- - name: BLEU
510
- type: bleu
511
- value: 55.5
512
- - name: chr-F
513
- type: chrf
514
- value: 0.71516
515
- - task:
516
- name: Translation deu-afr
517
- type: translation
518
- args: deu-afr
519
- dataset:
520
- name: tatoeba-test-v2021-08-07
521
- type: tatoeba_mt
522
- args: deu-afr
523
- metrics:
524
- - name: BLEU
525
- type: bleu
526
- value: 54.3
527
- - name: chr-F
528
- type: chrf
529
- value: 0.70274
530
- - task:
531
  name: Translation deu-eng
532
- type: translation
533
- args: deu-eng
534
- dataset:
535
- name: tatoeba-test-v2021-08-07
536
- type: tatoeba_mt
537
- args: deu-eng
538
- metrics:
539
- - name: BLEU
540
- type: bleu
541
- value: 48.6
542
- - name: chr-F
543
- type: chrf
544
- value: 0.66023
545
- - task:
546
- name: Translation deu-nds
547
- type: translation
548
- args: deu-nds
549
- dataset:
550
- name: tatoeba-test-v2021-08-07
551
- type: tatoeba_mt
552
- args: deu-nds
553
- metrics:
554
- - name: BLEU
555
- type: bleu
556
- value: 23.2
557
- - name: chr-F
558
- type: chrf
559
- value: 0.48058
560
- - task:
561
- name: Translation deu-nld
562
- type: translation
563
- args: deu-nld
564
- dataset:
565
- name: tatoeba-test-v2021-08-07
566
- type: tatoeba_mt
567
- args: deu-nld
568
- metrics:
569
- - name: BLEU
570
- type: bleu
571
- value: 54.6
572
- - name: chr-F
573
- type: chrf
574
- value: 0.71440
575
- - task:
576
- name: Translation eng-afr
577
- type: translation
578
- args: eng-afr
579
- dataset:
580
- name: tatoeba-test-v2021-08-07
581
- type: tatoeba_mt
582
- args: eng-afr
583
- metrics:
584
- - name: BLEU
585
- type: bleu
586
- value: 56.5
587
- - name: chr-F
588
- type: chrf
589
- value: 0.71995
590
- - task:
591
- name: Translation eng-deu
592
- type: translation
593
- args: eng-deu
594
- dataset:
595
- name: tatoeba-test-v2021-08-07
596
- type: tatoeba_mt
597
- args: eng-deu
598
- metrics:
599
- - name: BLEU
600
- type: bleu
601
- value: 42.0
602
- - name: chr-F
603
- type: chrf
604
- value: 0.63103
605
- - task:
606
- name: Translation eng-fry
607
- type: translation
608
- args: eng-fry
609
- dataset:
610
- name: tatoeba-test-v2021-03-30
611
- type: tatoeba_mt
612
- args: eng-fry
613
- metrics:
614
- - name: BLEU
615
- type: bleu
616
- value: 21.3
617
- - name: chr-F
618
- type: chrf
619
- value: 0.38580
620
- - task:
621
- name: Translation eng-nld
622
- type: translation
623
- args: eng-nld
624
- dataset:
625
- name: tatoeba-test-v2021-08-07
626
- type: tatoeba_mt
627
- args: eng-nld
628
- metrics:
629
- - name: BLEU
630
- type: bleu
631
- value: 54.5
632
- - name: chr-F
633
- type: chrf
634
- value: 0.71062
635
- - task:
636
- name: Translation fry-eng
637
- type: translation
638
- args: fry-eng
639
- dataset:
640
- name: tatoeba-test-v2021-08-07
641
- type: tatoeba_mt
642
- args: fry-eng
643
- metrics:
644
- - name: BLEU
645
- type: bleu
646
- value: 25.1
647
- - name: chr-F
648
- type: chrf
649
- value: 0.40545
650
- - task:
651
- name: Translation fry-nld
652
- type: translation
653
- args: fry-nld
654
- dataset:
655
- name: tatoeba-test-v2021-08-07
656
- type: tatoeba_mt
657
- args: fry-nld
658
- metrics:
659
- - name: BLEU
660
- type: bleu
661
- value: 41.7
662
- - name: chr-F
663
- type: chrf
664
- value: 0.55771
665
- - task:
666
- name: Translation gos-deu
667
- type: translation
668
- args: gos-deu
669
- dataset:
670
- name: tatoeba-test-v2021-08-07
671
- type: tatoeba_mt
672
- args: gos-deu
673
- metrics:
674
- - name: BLEU
675
- type: bleu
676
- value: 25.4
677
- - name: chr-F
678
- type: chrf
679
- value: 0.45302
680
- - task:
681
- name: Translation gos-eng
682
- type: translation
683
- args: gos-eng
684
- dataset:
685
- name: tatoeba-test-v2021-08-07
686
- type: tatoeba_mt
687
- args: gos-eng
688
- metrics:
689
- - name: BLEU
690
- type: bleu
691
- value: 24.1
692
- - name: chr-F
693
- type: chrf
694
- value: 0.37628
695
- - task:
696
- name: Translation gos-nld
697
- type: translation
698
- args: gos-nld
699
- dataset:
700
- name: tatoeba-test-v2021-08-07
701
- type: tatoeba_mt
702
- args: gos-nld
703
- metrics:
704
- - name: BLEU
705
- type: bleu
706
- value: 26.2
707
- - name: chr-F
708
- type: chrf
709
- value: 0.45777
710
- - task:
711
- name: Translation ltz-deu
712
- type: translation
713
- args: ltz-deu
714
- dataset:
715
- name: tatoeba-test-v2021-08-07
716
- type: tatoeba_mt
717
- args: ltz-deu
718
- metrics:
719
- - name: BLEU
720
- type: bleu
721
- value: 21.3
722
- - name: chr-F
723
- type: chrf
724
- value: 0.37165
725
- - task:
726
- name: Translation ltz-eng
727
- type: translation
728
- args: ltz-eng
729
- dataset:
730
- name: tatoeba-test-v2021-08-07
731
- type: tatoeba_mt
732
- args: ltz-eng
733
- metrics:
734
- - name: BLEU
735
- type: bleu
736
- value: 30.3
737
- - name: chr-F
738
- type: chrf
739
- value: 0.37784
740
- - task:
741
- name: Translation ltz-nld
742
- type: translation
743
- args: ltz-nld
744
- dataset:
745
- name: tatoeba-test-v2021-08-07
746
- type: tatoeba_mt
747
- args: ltz-nld
748
- metrics:
749
- - name: BLEU
750
- type: bleu
751
- value: 26.7
752
- - name: chr-F
753
- type: chrf
754
- value: 0.32823
755
- - task:
756
- name: Translation nds-deu
757
- type: translation
758
- args: nds-deu
759
- dataset:
760
- name: tatoeba-test-v2021-08-07
761
- type: tatoeba_mt
762
- args: nds-deu
763
- metrics:
764
- - name: BLEU
765
- type: bleu
766
- value: 45.4
767
- - name: chr-F
768
- type: chrf
769
- value: 0.64008
770
- - task:
771
- name: Translation nds-eng
772
- type: translation
773
- args: nds-eng
774
- dataset:
775
- name: tatoeba-test-v2021-08-07
776
- type: tatoeba_mt
777
- args: nds-eng
778
- metrics:
779
- - name: BLEU
780
- type: bleu
781
- value: 38.3
782
- - name: chr-F
783
- type: chrf
784
- value: 0.55193
785
- - task:
786
- name: Translation nds-nld
787
- type: translation
788
- args: nds-nld
789
- dataset:
790
- name: tatoeba-test-v2021-08-07
791
- type: tatoeba_mt
792
- args: nds-nld
793
- metrics:
794
- - name: BLEU
795
- type: bleu
796
- value: 50.0
797
- - name: chr-F
798
- type: chrf
799
- value: 0.66943
800
- - task:
801
- name: Translation nld-afr
802
- type: translation
803
- args: nld-afr
804
- dataset:
805
- name: tatoeba-test-v2021-08-07
806
- type: tatoeba_mt
807
- args: nld-afr
808
- metrics:
809
- - name: BLEU
810
- type: bleu
811
- value: 62.3
812
- - name: chr-F
813
- type: chrf
814
- value: 0.76610
815
- - task:
816
- name: Translation nld-deu
817
- type: translation
818
- args: nld-deu
819
- dataset:
820
- name: tatoeba-test-v2021-08-07
821
- type: tatoeba_mt
822
- args: nld-deu
823
- metrics:
824
- - name: BLEU
825
- type: bleu
826
- value: 56.8
827
- - name: chr-F
828
- type: chrf
829
- value: 0.73162
830
- - task:
831
- name: Translation nld-eng
832
- type: translation
833
- args: nld-eng
834
- dataset:
835
- name: tatoeba-test-v2021-08-07
836
- type: tatoeba_mt
837
- args: nld-eng
838
- metrics:
839
- - name: BLEU
840
- type: bleu
841
- value: 60.5
842
- - name: chr-F
843
- type: chrf
844
- value: 0.74088
845
- - task:
846
- name: Translation nld-fry
847
- type: translation
848
- args: nld-fry
849
- dataset:
850
- name: tatoeba-test-v2021-08-07
851
- type: tatoeba_mt
852
- args: nld-fry
853
- metrics:
854
- - name: BLEU
855
- type: bleu
856
- value: 31.4
857
- - name: chr-F
858
- type: chrf
859
- value: 0.48460
860
- - task:
861
- name: Translation deu-eng
862
- type: translation
863
- args: deu-eng
864
  dataset:
865
  name: newstest2009
866
  type: wmt-2009-news
867
  args: deu-eng
868
  metrics:
869
- - name: BLEU
870
- type: bleu
871
- value: 25.9
872
- - name: chr-F
873
- type: chrf
874
- value: 0.53747
875
  - task:
876
- name: Translation eng-deu
877
  type: translation
878
- args: eng-deu
879
- dataset:
880
- name: newstest2009
881
- type: wmt-2009-news
882
- args: eng-deu
883
- metrics:
884
- - name: BLEU
885
- type: bleu
886
- value: 22.9
887
- - name: chr-F
888
- type: chrf
889
- value: 0.53283
890
- - task:
891
  name: Translation deu-eng
892
- type: translation
893
- args: deu-eng
894
  dataset:
895
  name: newstest2010
896
  type: wmt-2010-news
897
  args: deu-eng
898
  metrics:
899
- - name: BLEU
900
- type: bleu
901
- value: 30.6
902
- - name: chr-F
903
- type: chrf
904
- value: 0.58355
905
  - task:
906
- name: Translation eng-deu
907
  type: translation
908
- args: eng-deu
909
- dataset:
910
- name: newstest2010
911
- type: wmt-2010-news
912
- args: eng-deu
913
- metrics:
914
- - name: BLEU
915
- type: bleu
916
- value: 25.8
917
- - name: chr-F
918
- type: chrf
919
- value: 0.54885
920
- - task:
921
  name: Translation deu-eng
922
- type: translation
923
- args: deu-eng
924
  dataset:
925
  name: newstest2011
926
  type: wmt-2011-news
927
  args: deu-eng
928
  metrics:
929
- - name: BLEU
930
- type: bleu
931
- value: 26.3
932
- - name: chr-F
933
- type: chrf
934
- value: 0.54883
935
  - task:
936
- name: Translation eng-deu
937
  type: translation
938
- args: eng-deu
939
- dataset:
940
- name: newstest2011
941
- type: wmt-2011-news
942
- args: eng-deu
943
- metrics:
944
- - name: BLEU
945
- type: bleu
946
- value: 23.1
947
- - name: chr-F
948
- type: chrf
949
- value: 0.52712
950
- - task:
951
  name: Translation deu-eng
952
- type: translation
953
- args: deu-eng
954
  dataset:
955
  name: newstest2012
956
  type: wmt-2012-news
957
  args: deu-eng
958
  metrics:
959
- - name: BLEU
960
- type: bleu
961
- value: 28.5
962
- - name: chr-F
963
- type: chrf
964
- value: 0.56153
965
  - task:
966
- name: Translation eng-deu
967
  type: translation
968
- args: eng-deu
969
- dataset:
970
- name: newstest2012
971
- type: wmt-2012-news
972
- args: eng-deu
973
- metrics:
974
- - name: BLEU
975
- type: bleu
976
- value: 23.3
977
- - name: chr-F
978
- type: chrf
979
- value: 0.52662
980
- - task:
981
  name: Translation deu-eng
982
- type: translation
983
- args: deu-eng
984
  dataset:
985
  name: newstest2013
986
  type: wmt-2013-news
987
  args: deu-eng
988
  metrics:
989
- - name: BLEU
990
- type: bleu
991
- value: 31.4
992
- - name: chr-F
993
- type: chrf
994
- value: 0.57770
995
  - task:
996
- name: Translation eng-deu
997
  type: translation
998
- args: eng-deu
999
- dataset:
1000
- name: newstest2013
1001
- type: wmt-2013-news
1002
- args: eng-deu
1003
- metrics:
1004
- - name: BLEU
1005
- type: bleu
1006
- value: 27.8
1007
- - name: chr-F
1008
- type: chrf
1009
- value: 0.55774
1010
- - task:
1011
  name: Translation deu-eng
1012
- type: translation
1013
- args: deu-eng
1014
  dataset:
1015
  name: newstest2014
1016
  type: wmt-2014-news
1017
  args: deu-eng
1018
  metrics:
1019
- - name: BLEU
1020
- type: bleu
1021
- value: 33.2
1022
- - name: chr-F
1023
- type: chrf
1024
- value: 0.59826
1025
  - task:
1026
- name: Translation eng-deu
1027
  type: translation
1028
- args: eng-deu
1029
- dataset:
1030
- name: newstest2014
1031
- type: wmt-2014-news
1032
- args: eng-deu
1033
- metrics:
1034
- - name: BLEU
1035
- type: bleu
1036
- value: 29.0
1037
- - name: chr-F
1038
- type: chrf
1039
- value: 0.59301
1040
- - task:
1041
  name: Translation deu-eng
1042
- type: translation
1043
- args: deu-eng
1044
  dataset:
1045
  name: newstest2015
1046
  type: wmt-2015-news
1047
  args: deu-eng
1048
  metrics:
1049
- - name: BLEU
1050
- type: bleu
1051
- value: 33.4
1052
- - name: chr-F
1053
- type: chrf
1054
- value: 0.59660
1055
  - task:
1056
- name: Translation eng-deu
1057
  type: translation
1058
- args: eng-deu
1059
- dataset:
1060
- name: newstest2015
1061
- type: wmt-2015-news
1062
- args: eng-deu
1063
- metrics:
1064
- - name: BLEU
1065
- type: bleu
1066
- value: 32.3
1067
- - name: chr-F
1068
- type: chrf
1069
- value: 0.59889
1070
- - task:
1071
  name: Translation deu-eng
1072
- type: translation
1073
- args: deu-eng
1074
  dataset:
1075
  name: newstest2016
1076
  type: wmt-2016-news
1077
  args: deu-eng
1078
  metrics:
1079
- - name: BLEU
1080
- type: bleu
1081
- value: 39.8
1082
- - name: chr-F
1083
- type: chrf
1084
- value: 0.64736
1085
  - task:
1086
- name: Translation eng-deu
1087
  type: translation
1088
- args: eng-deu
1089
- dataset:
1090
- name: newstest2016
1091
- type: wmt-2016-news
1092
- args: eng-deu
1093
- metrics:
1094
- - name: BLEU
1095
- type: bleu
1096
- value: 38.3
1097
- - name: chr-F
1098
- type: chrf
1099
- value: 0.64427
1100
- - task:
1101
  name: Translation deu-eng
1102
- type: translation
1103
- args: deu-eng
1104
  dataset:
1105
  name: newstest2017
1106
  type: wmt-2017-news
1107
  args: deu-eng
1108
  metrics:
1109
- - name: BLEU
1110
- type: bleu
1111
- value: 35.2
1112
- - name: chr-F
1113
- type: chrf
1114
- value: 0.60933
1115
  - task:
1116
- name: Translation eng-deu
1117
  type: translation
1118
- args: eng-deu
1119
- dataset:
1120
- name: newstest2017
1121
- type: wmt-2017-news
1122
- args: eng-deu
1123
- metrics:
1124
- - name: BLEU
1125
- type: bleu
1126
- value: 30.7
1127
- - name: chr-F
1128
- type: chrf
1129
- value: 0.59257
1130
- - task:
1131
  name: Translation deu-eng
1132
- type: translation
1133
- args: deu-eng
1134
  dataset:
1135
  name: newstest2018
1136
  type: wmt-2018-news
1137
  args: deu-eng
1138
  metrics:
1139
- - name: BLEU
1140
- type: bleu
1141
- value: 42.6
1142
- - name: chr-F
1143
- type: chrf
1144
- value: 0.66797
1145
  - task:
1146
- name: Translation eng-deu
1147
  type: translation
1148
- args: eng-deu
1149
- dataset:
1150
- name: newstest2018
1151
- type: wmt-2018-news
1152
- args: eng-deu
1153
- metrics:
1154
- - name: BLEU
1155
- type: bleu
1156
- value: 46.5
1157
- - name: chr-F
1158
- type: chrf
1159
- value: 0.69605
1160
- - task:
1161
  name: Translation deu-eng
1162
- type: translation
1163
- args: deu-eng
1164
  dataset:
1165
  name: newstest2019
1166
  type: wmt-2019-news
1167
  args: deu-eng
1168
  metrics:
1169
- - name: BLEU
1170
- type: bleu
1171
- value: 39.7
1172
- - name: chr-F
1173
- type: chrf
1174
- value: 0.63749
1175
  - task:
1176
- name: Translation eng-deu
1177
  type: translation
1178
- args: eng-deu
1179
- dataset:
1180
- name: newstest2019
1181
- type: wmt-2019-news
1182
- args: eng-deu
1183
- metrics:
1184
- - name: BLEU
1185
- type: bleu
1186
- value: 42.9
1187
- - name: chr-F
1188
- type: chrf
1189
- value: 0.66751
1190
- - task:
1191
  name: Translation deu-eng
1192
- type: translation
1193
- args: deu-eng
1194
  dataset:
1195
  name: newstest2020
1196
  type: wmt-2020-news
1197
  args: deu-eng
1198
  metrics:
1199
- - name: BLEU
1200
- type: bleu
1201
- value: 35.0
1202
- - name: chr-F
1203
- type: chrf
1204
- value: 0.61200
1205
- - task:
1206
- name: Translation eng-deu
1207
- type: translation
1208
- args: eng-deu
1209
- dataset:
1210
- name: newstest2020
1211
- type: wmt-2020-news
1212
- args: eng-deu
1213
- metrics:
1214
- - name: BLEU
1215
- type: bleu
1216
- value: 32.3
1217
- - name: chr-F
1218
- type: chrf
1219
- value: 0.60411
1220
  ---
1221
  # opus-mt-tc-big-gmw-gmw
1222
 
@@ -1273,7 +704,7 @@ from transformers import MarianMTModel, MarianTokenizer
1273
 
1274
  src_text = [
1275
  ">>nds<< Red keinen Quatsch.",
1276
- ">>eng<< Findet ihr das nicht etwas übereilt?"
1277
  ]
1278
 
1279
  model_name = "pytorch-models/opus-mt-tc-big-gmw-gmw"
@@ -1408,7 +839,7 @@ print(pipe(">>nds<< Red keinen Quatsch."))
1408
 
1409
  ## Citation Information
1410
 
1411
- * Publications: [OPUS-MT Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please, cite if you use this model.)
1412
 
1413
  ```
1414
  @inproceedings{tiedemann-thottingal-2020-opus,
@@ -1438,7 +869,7 @@ print(pipe(">>nds<< Red keinen Quatsch."))
1438
 
1439
  ## Acknowledgements
1440
 
1441
- The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Unions Horizon 2020 research and innovation programme (grant agreement No 771113), and the [MeMAD project](https://memad.eu/), funded by the European Unions Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.
1442
 
1443
  ## Model conversion info
1444
 
 
8
  - lb
9
  - nds
10
  - nl
11
+ - multilingual
12
+ license: cc-by-4.0
13
  tags:
14
  - translation
15
  - opus-mt-tc
16
  model-index:
17
  - name: opus-mt-tc-big-gmw-gmw
18
  results:
19
  - task:
20
  type: translation
21
+ name: Translation afr-deu
22
  dataset:
23
  name: flores101-devtest
24
  type: flores_101
25
  args: afr deu devtest
26
  metrics:
27
+ - type: bleu
28
+ value: 30.2
29
+ name: BLEU
30
+ - type: chrf
31
+ value: 0.58718
32
+ name: chr-F
33
+ - type: bleu
34
+ value: 55.1
35
+ name: BLEU
36
+ - type: chrf
37
+ value: 0.74826
38
+ name: chr-F
39
+ - type: bleu
40
+ value: 15.7
41
+ name: BLEU
42
+ - type: chrf
43
+ value: 0.46826
44
+ name: chr-F
45
+ - type: bleu
46
+ value: 22.5
47
+ name: BLEU
48
+ - type: chrf
49
+ value: 0.54441
50
+ name: chr-F
51
+ - type: bleu
52
+ value: 26.4
53
+ name: BLEU
54
+ - type: chrf
55
+ value: 0.57835
56
+ name: chr-F
57
+ - type: bleu
58
+ value: 41.8
59
+ name: BLEU
60
+ - type: chrf
61
+ value: 0.6699
62
+ name: chr-F
63
+ - type: bleu
64
+ value: 20.3
65
+ name: BLEU
66
+ - type: chrf
67
+ value: 0.52554
68
+ name: chr-F
69
+ - type: bleu
70
+ value: 24.2
71
+ name: BLEU
72
+ - type: chrf
73
+ value: 0.5571
74
+ name: chr-F
75
+ - type: bleu
76
+ value: 40.7
77
+ name: BLEU
78
+ - type: chrf
79
+ value: 0.68429
80
+ name: chr-F
81
+ - type: bleu
82
+ value: 38.5
83
+ name: BLEU
84
+ - type: chrf
85
+ value: 0.64888
86
+ name: chr-F
87
+ - type: bleu
88
+ value: 18.4
89
+ name: BLEU
90
+ - type: chrf
91
+ value: 0.49231
92
+ name: chr-F
93
+ - type: bleu
94
+ value: 26.8
95
+ name: BLEU
96
+ - type: chrf
97
+ value: 0.57984
98
+ name: chr-F
99
+ - type: bleu
100
+ value: 23.2
101
+ name: BLEU
102
+ - type: chrf
103
+ value: 0.53623
104
+ name: chr-F
105
+ - type: bleu
106
+ value: 30.0
107
+ name: BLEU
108
+ - type: chrf
109
+ value: 0.59122
110
+ name: chr-F
111
+ - type: bleu
112
+ value: 31.0
113
+ name: BLEU
114
+ - type: chrf
115
+ value: 0.57557
116
+ name: chr-F
117
+ - type: bleu
118
+ value: 18.6
119
+ name: BLEU
120
+ - type: chrf
121
+ value: 0.49312
122
+ name: chr-F
123
+ - type: bleu
124
+ value: 20.0
125
+ name: BLEU
126
+ - type: chrf
127
+ value: 0.52409
128
+ name: chr-F
129
+ - type: bleu
130
+ value: 22.6
131
+ name: BLEU
132
+ - type: chrf
133
+ value: 0.53898
134
+ name: chr-F
135
+ - type: bleu
136
+ value: 30.7
137
+ name: BLEU
138
+ - type: chrf
139
+ value: 0.5897
140
+ name: chr-F
141
+ - type: bleu
142
+ value: 11.8
143
+ name: BLEU
144
+ - type: chrf
145
+ value: 0.42637
146
+ name: chr-F
147
  - task:
148
  type: translation
149
  name: Translation deu-eng
150
  dataset:
151
  name: multi30k_test_2016_flickr
152
  type: multi30k-2016_flickr
153
  args: deu-eng
154
  metrics:
155
+ - type: bleu
156
+ value: 39.9
157
+ name: BLEU
158
+ - type: chrf
159
+ value: 0.60928
160
+ name: chr-F
161
+ - type: bleu
162
+ value: 35.4
163
+ name: BLEU
164
+ - type: chrf
165
+ value: 0.64172
166
+ name: chr-F
167
  - task:
168
  type: translation
169
  name: Translation deu-eng
170
  dataset:
171
  name: multi30k_test_2017_flickr
172
  type: multi30k-2017_flickr
173
  args: deu-eng
174
  metrics:
175
+ - type: bleu
176
+ value: 40.5
177
+ name: BLEU
178
+ - type: chrf
179
+ value: 0.63154
180
+ name: chr-F
181
+ - type: bleu
182
+ value: 34.2
183
+ name: BLEU
184
+ - type: chrf
185
+ value: 0.63078
186
+ name: chr-F
187
  - task:
188
  type: translation
189
  name: Translation deu-eng
190
  dataset:
191
  name: multi30k_test_2017_mscoco
192
  type: multi30k-2017_mscoco
193
  args: deu-eng
194
  metrics:
195
+ - type: bleu
196
+ value: 32.2
197
+ name: BLEU
198
+ - type: chrf
199
+ value: 0.55708
200
+ name: chr-F
201
+ - type: bleu
202
+ value: 29.1
203
+ name: BLEU
204
+ - type: chrf
205
+ value: 0.57537
206
+ name: chr-F
207
  - task:
208
  type: translation
209
  name: Translation deu-eng
210
  dataset:
211
  name: multi30k_test_2018_flickr
212
  type: multi30k-2018_flickr
213
  args: deu-eng
214
  metrics:
215
+ - type: bleu
216
+ value: 36.9
217
+ name: BLEU
218
+ - type: chrf
219
+ value: 0.59422
220
+ name: chr-F
221
+ - type: bleu
222
+ value: 30.0
223
+ name: BLEU
224
+ - type: chrf
225
+ value: 0.59597
226
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: news-test2008
  type: news-test2008
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 27.2
+ name: BLEU
+ - type: chrf
+ value: 0.54601
+ name: chr-F
+ - type: bleu
+ value: 23.6
+ name: BLEU
+ - type: chrf
+ value: 0.53149
+ name: chr-F
  - task:
  type: translation
  name: Translation afr-deu
  dataset:
  name: tatoeba-test-v2021-08-07
  type: tatoeba_mt
  args: afr-deu
  metrics:
+ - type: bleu
+ value: 50.4
+ name: BLEU
+ - type: chrf
+ value: 0.68679
+ name: chr-F
+ - type: bleu
+ value: 56.6
+ name: BLEU
+ - type: chrf
+ value: 0.70682
+ name: chr-F
+ - type: bleu
+ value: 55.5
+ name: BLEU
+ - type: chrf
+ value: 0.71516
+ name: chr-F
+ - type: bleu
+ value: 54.3
+ name: BLEU
+ - type: chrf
+ value: 0.70274
+ name: chr-F
+ - type: bleu
+ value: 48.6
+ name: BLEU
+ - type: chrf
+ value: 0.66023
+ name: chr-F
+ - type: bleu
+ value: 23.2
+ name: BLEU
+ - type: chrf
+ value: 0.48058
+ name: chr-F
+ - type: bleu
+ value: 54.6
+ name: BLEU
+ - type: chrf
+ value: 0.7144
+ name: chr-F
+ - type: bleu
+ value: 56.5
+ name: BLEU
+ - type: chrf
+ value: 0.71995
+ name: chr-F
+ - type: bleu
+ value: 42.0
+ name: BLEU
+ - type: chrf
+ value: 0.63103
+ name: chr-F
+ - type: bleu
+ value: 21.3
+ name: BLEU
+ - type: chrf
+ value: 0.3858
+ name: chr-F
+ - type: bleu
+ value: 54.5
+ name: BLEU
+ - type: chrf
+ value: 0.71062
+ name: chr-F
+ - type: bleu
+ value: 25.1
+ name: BLEU
+ - type: chrf
+ value: 0.40545
+ name: chr-F
+ - type: bleu
+ value: 41.7
+ name: BLEU
+ - type: chrf
+ value: 0.55771
+ name: chr-F
+ - type: bleu
+ value: 25.4
+ name: BLEU
+ - type: chrf
+ value: 0.45302
+ name: chr-F
+ - type: bleu
+ value: 24.1
+ name: BLEU
+ - type: chrf
+ value: 0.37628
+ name: chr-F
+ - type: bleu
+ value: 26.2
+ name: BLEU
+ - type: chrf
+ value: 0.45777
+ name: chr-F
+ - type: bleu
+ value: 21.3
+ name: BLEU
+ - type: chrf
+ value: 0.37165
+ name: chr-F
+ - type: bleu
+ value: 30.3
+ name: BLEU
+ - type: chrf
+ value: 0.37784
+ name: chr-F
+ - type: bleu
+ value: 26.7
+ name: BLEU
+ - type: chrf
+ value: 0.32823
+ name: chr-F
+ - type: bleu
+ value: 45.4
+ name: BLEU
+ - type: chrf
+ value: 0.64008
+ name: chr-F
+ - type: bleu
+ value: 38.3
+ name: BLEU
+ - type: chrf
+ value: 0.55193
+ name: chr-F
+ - type: bleu
+ value: 50.0
+ name: BLEU
+ - type: chrf
+ value: 0.66943
+ name: chr-F
+ - type: bleu
+ value: 62.3
+ name: BLEU
+ - type: chrf
+ value: 0.7661
+ name: chr-F
+ - type: bleu
+ value: 56.8
+ name: BLEU
+ - type: chrf
+ value: 0.73162
+ name: chr-F
+ - type: bleu
+ value: 60.5
+ name: BLEU
+ - type: chrf
+ value: 0.74088
+ name: chr-F
+ - type: bleu
+ value: 31.4
+ name: BLEU
+ - type: chrf
+ value: 0.4846
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: newstest2009
  type: wmt-2009-news
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 25.9
+ name: BLEU
+ - type: chrf
+ value: 0.53747
+ name: chr-F
+ - type: bleu
+ value: 22.9
+ name: BLEU
+ - type: chrf
+ value: 0.53283
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: newstest2010
  type: wmt-2010-news
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 30.6
+ name: BLEU
+ - type: chrf
+ value: 0.58355
+ name: chr-F
+ - type: bleu
+ value: 25.8
+ name: BLEU
+ - type: chrf
+ value: 0.54885
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: newstest2011
  type: wmt-2011-news
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 26.3
+ name: BLEU
+ - type: chrf
+ value: 0.54883
+ name: chr-F
+ - type: bleu
+ value: 23.1
+ name: BLEU
+ - type: chrf
+ value: 0.52712
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: newstest2012
  type: wmt-2012-news
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 28.5
+ name: BLEU
+ - type: chrf
+ value: 0.56153
+ name: chr-F
+ - type: bleu
+ value: 23.3
+ name: BLEU
+ - type: chrf
+ value: 0.52662
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: newstest2013
  type: wmt-2013-news
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 31.4
+ name: BLEU
+ - type: chrf
+ value: 0.5777
+ name: chr-F
+ - type: bleu
+ value: 27.8
+ name: BLEU
+ - type: chrf
+ value: 0.55774
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: newstest2014
  type: wmt-2014-news
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 33.2
+ name: BLEU
+ - type: chrf
+ value: 0.59826
+ name: chr-F
+ - type: bleu
+ value: 29.0
+ name: BLEU
+ - type: chrf
+ value: 0.59301
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: newstest2015
  type: wmt-2015-news
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 33.4
+ name: BLEU
+ - type: chrf
+ value: 0.5966
+ name: chr-F
+ - type: bleu
+ value: 32.3
+ name: BLEU
+ - type: chrf
+ value: 0.59889
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: newstest2016
  type: wmt-2016-news
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 39.8
+ name: BLEU
+ - type: chrf
+ value: 0.64736
+ name: chr-F
+ - type: bleu
+ value: 38.3
+ name: BLEU
+ - type: chrf
+ value: 0.64427
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: newstest2017
  type: wmt-2017-news
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 35.2
+ name: BLEU
+ - type: chrf
+ value: 0.60933
+ name: chr-F
+ - type: bleu
+ value: 30.7
+ name: BLEU
+ - type: chrf
+ value: 0.59257
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: newstest2018
  type: wmt-2018-news
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 42.6
+ name: BLEU
+ - type: chrf
+ value: 0.66797
+ name: chr-F
+ - type: bleu
+ value: 46.5
+ name: BLEU
+ - type: chrf
+ value: 0.69605
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: newstest2019
  type: wmt-2019-news
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 39.7
+ name: BLEU
+ - type: chrf
+ value: 0.63749
+ name: chr-F
+ - type: bleu
+ value: 42.9
+ name: BLEU
+ - type: chrf
+ value: 0.66751
+ name: chr-F
  - task:
  type: translation
  name: Translation deu-eng
  dataset:
  name: newstest2020
  type: wmt-2020-news
  args: deu-eng
  metrics:
+ - type: bleu
+ value: 35.0
+ name: BLEU
+ - type: chrf
+ value: 0.612
+ name: chr-F
+ - type: bleu
+ value: 32.3
+ name: BLEU
+ - type: chrf
+ value: 0.60411
+ name: chr-F
  ---
  # opus-mt-tc-big-gmw-gmw
 
 

  src_text = [
  ">>nds<< Red keinen Quatsch.",
+ ">>eng<< Findet ihr das nicht etwas übereilt?"
  ]

  model_name = "pytorch-models/opus-mt-tc-big-gmw-gmw"
 

  ## Citation Information

+ * Publications: [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please cite them if you use this model.)

  ```
  @inproceedings{tiedemann-thottingal-2020-opus,
 

  ## Acknowledgements

+ The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 771113), and by the [MeMAD project](https://memad.eu/), funded by the European Union's Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.

  ## Model conversion info