Commit 357c709 by cherifkhalifah
1 Parent(s): 481320f

End of training

README.md ADDED
@@ -0,0 +1,161 @@
+ ---
+ license: apache-2.0
+ base_model: Helsinki-NLP/opus-mt-en-ar
+ tags:
+ - generated_from_trainer
+ metrics:
+ - bleu
+ model-index:
+ - name: Tounsify-v0.7
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # Tounsify-v0.7
+
+ This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-ar](https://huggingface.co/Helsinki-NLP/opus-mt-en-ar) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 2.4410
+ - Bleu: 24.686
+ - Gen Len: 7.1333
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
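As a minimal usage sketch (the Hub repository id `cherifkhalifah/Tounsify-v0.7` is assumed here; adjust it to wherever this checkpoint is actually hosted), the model can be loaded with the `transformers` translation pipeline:

```python
from transformers import pipeline

# Assumed repository id for this checkpoint; replace with the actual Hub id.
translator = pipeline("translation", model="cherifkhalifah/Tounsify-v0.7")

# Decoding defaults (num_beams=4, max_length=512) are read from the bundled
# generation_config.json.
result = translator("How are you today?")
print(result[0]["translation_text"])
```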
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+ - mixed_precision_training: Native AMP
+
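For reference, the list above maps onto `Seq2SeqTrainingArguments` roughly as follows; this is a reconstruction from the reported values, not the original training script, and the `output_dir` and evaluation cadence are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="Tounsify-v0.7",       # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    fp16=True,                        # "Native AMP" mixed precision
    evaluation_strategy="epoch",      # assumed: the table below reports one eval per epoch
    predict_with_generate=True,       # required to compute BLEU / Gen Len during eval
)
# The optimizer is the default Adam(W) with betas=(0.9, 0.999) and epsilon=1e-08,
# matching the values listed above.
```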
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
+ |:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
+ | No log | 1.0 | 8 | 2.9143 | 16.1913 | 7.4 |
+ | No log | 2.0 | 16 | 2.6238 | 17.5534 | 6.9333 |
+ | No log | 3.0 | 24 | 2.4215 | 11.3684 | 6.3333 |
+ | No log | 4.0 | 32 | 2.2967 | 12.1601 | 6.3 |
+ | No log | 5.0 | 40 | 2.2459 | 12.4837 | 7.0333 |
+ | No log | 6.0 | 48 | 2.2333 | 13.2937 | 6.5667 |
+ | No log | 7.0 | 56 | 2.2264 | 18.1441 | 6.7667 |
+ | No log | 8.0 | 64 | 2.2167 | 14.5825 | 6.5333 |
+ | No log | 9.0 | 72 | 2.2064 | 15.1734 | 6.6667 |
+ | No log | 10.0 | 80 | 2.1951 | 14.6563 | 7.0333 |
+ | No log | 11.0 | 88 | 2.2060 | 19.0714 | 6.6333 |
+ | No log | 12.0 | 96 | 2.2088 | 21.5449 | 6.5333 |
+ | No log | 13.0 | 104 | 2.2517 | 21.4297 | 6.5333 |
+ | No log | 14.0 | 112 | 2.2584 | 24.6131 | 6.6 |
+ | No log | 15.0 | 120 | 2.2411 | 24.7358 | 6.6667 |
+ | No log | 16.0 | 128 | 2.2464 | 24.7358 | 6.6667 |
+ | No log | 17.0 | 136 | 2.2502 | 24.7358 | 6.6667 |
+ | No log | 18.0 | 144 | 2.2567 | 24.7358 | 6.6667 |
+ | No log | 19.0 | 152 | 2.2496 | 24.7358 | 6.6667 |
+ | No log | 20.0 | 160 | 2.2511 | 24.5996 | 6.8 |
+ | No log | 21.0 | 168 | 2.2668 | 24.5996 | 6.8 |
+ | No log | 22.0 | 176 | 2.2805 | 24.7358 | 6.6667 |
+ | No log | 23.0 | 184 | 2.2875 | 24.7358 | 6.6667 |
+ | No log | 24.0 | 192 | 2.2900 | 24.7358 | 6.6667 |
+ | No log | 25.0 | 200 | 2.2828 | 21.51 | 6.6667 |
+ | No log | 26.0 | 208 | 2.2676 | 21.51 | 6.6667 |
+ | No log | 27.0 | 216 | 2.2684 | 24.7358 | 6.6667 |
+ | No log | 28.0 | 224 | 2.2725 | 24.7358 | 6.6667 |
+ | No log | 29.0 | 232 | 2.2768 | 24.7358 | 6.6667 |
+ | No log | 30.0 | 240 | 2.2810 | 24.7358 | 6.6667 |
+ | No log | 31.0 | 248 | 2.2958 | 24.7358 | 6.6667 |
+ | No log | 32.0 | 256 | 2.3036 | 24.7358 | 6.6667 |
+ | No log | 33.0 | 264 | 2.3120 | 24.7358 | 6.7333 |
+ | No log | 34.0 | 272 | 2.3205 | 24.7358 | 6.7333 |
+ | No log | 35.0 | 280 | 2.3305 | 24.7358 | 6.7333 |
+ | No log | 36.0 | 288 | 2.3413 | 24.9721 | 6.7333 |
+ | No log | 37.0 | 296 | 2.3424 | 24.7358 | 6.7333 |
+ | No log | 38.0 | 304 | 2.3472 | 24.7358 | 6.7333 |
+ | No log | 39.0 | 312 | 2.3526 | 24.7358 | 6.7333 |
+ | No log | 40.0 | 320 | 2.3579 | 24.7358 | 6.7333 |
+ | No log | 41.0 | 328 | 2.3630 | 24.7358 | 6.6667 |
+ | No log | 42.0 | 336 | 2.3628 | 24.7358 | 6.6667 |
+ | No log | 43.0 | 344 | 2.3637 | 24.8163 | 6.7667 |
+ | No log | 44.0 | 352 | 2.3619 | 24.8163 | 6.7667 |
+ | No log | 45.0 | 360 | 2.3584 | 24.8163 | 6.7667 |
+ | No log | 46.0 | 368 | 2.3562 | 24.8163 | 6.7667 |
+ | No log | 47.0 | 376 | 2.3605 | 24.8163 | 6.7667 |
+ | No log | 48.0 | 384 | 2.3680 | 24.8163 | 6.8333 |
+ | No log | 49.0 | 392 | 2.3774 | 24.686 | 6.9667 |
+ | No log | 50.0 | 400 | 2.3819 | 24.686 | 6.9667 |
+ | No log | 51.0 | 408 | 2.3850 | 24.686 | 6.9667 |
+ | No log | 52.0 | 416 | 2.3902 | 24.686 | 6.9667 |
+ | No log | 53.0 | 424 | 2.3935 | 24.686 | 6.9667 |
+ | No log | 54.0 | 432 | 2.3969 | 24.686 | 6.9667 |
+ | No log | 55.0 | 440 | 2.3988 | 24.686 | 6.9667 |
+ | No log | 56.0 | 448 | 2.3992 | 24.686 | 6.9667 |
+ | No log | 57.0 | 456 | 2.3986 | 24.686 | 6.9667 |
+ | No log | 58.0 | 464 | 2.3983 | 24.686 | 6.9 |
+ | No log | 59.0 | 472 | 2.4000 | 24.686 | 6.9 |
+ | No log | 60.0 | 480 | 2.4009 | 24.686 | 6.9 |
+ | No log | 61.0 | 488 | 2.4009 | 24.686 | 6.9 |
+ | No log | 62.0 | 496 | 2.4012 | 24.686 | 7.0667 |
+ | 0.2188 | 63.0 | 504 | 2.4027 | 24.686 | 7.0667 |
+ | 0.2188 | 64.0 | 512 | 2.4056 | 24.686 | 7.0667 |
+ | 0.2188 | 65.0 | 520 | 2.4080 | 24.686 | 7.0667 |
+ | 0.2188 | 66.0 | 528 | 2.4085 | 24.686 | 7.0667 |
+ | 0.2188 | 67.0 | 536 | 2.4128 | 24.686 | 7.0667 |
+ | 0.2188 | 68.0 | 544 | 2.4168 | 24.686 | 7.0667 |
+ | 0.2188 | 69.0 | 552 | 2.4201 | 24.686 | 7.0667 |
+ | 0.2188 | 70.0 | 560 | 2.4218 | 24.686 | 6.9 |
+ | 0.2188 | 71.0 | 568 | 2.4229 | 24.686 | 7.0667 |
+ | 0.2188 | 72.0 | 576 | 2.4250 | 24.686 | 7.0667 |
+ | 0.2188 | 73.0 | 584 | 2.4261 | 24.686 | 7.0667 |
+ | 0.2188 | 74.0 | 592 | 2.4262 | 24.686 | 7.0667 |
+ | 0.2188 | 75.0 | 600 | 2.4287 | 24.686 | 7.0667 |
+ | 0.2188 | 76.0 | 608 | 2.4313 | 24.686 | 7.1333 |
+ | 0.2188 | 77.0 | 616 | 2.4294 | 24.686 | 7.1333 |
+ | 0.2188 | 78.0 | 624 | 2.4280 | 24.686 | 7.1333 |
+ | 0.2188 | 79.0 | 632 | 2.4266 | 24.686 | 7.0667 |
+ | 0.2188 | 80.0 | 640 | 2.4257 | 24.686 | 7.0667 |
+ | 0.2188 | 81.0 | 648 | 2.4256 | 24.686 | 7.0667 |
+ | 0.2188 | 82.0 | 656 | 2.4279 | 24.686 | 7.0667 |
+ | 0.2188 | 83.0 | 664 | 2.4312 | 24.686 | 7.0667 |
+ | 0.2188 | 84.0 | 672 | 2.4329 | 24.686 | 7.1333 |
+ | 0.2188 | 85.0 | 680 | 2.4329 | 24.686 | 7.1333 |
+ | 0.2188 | 86.0 | 688 | 2.4324 | 24.686 | 7.1333 |
+ | 0.2188 | 87.0 | 696 | 2.4326 | 24.686 | 7.0667 |
+ | 0.2188 | 88.0 | 704 | 2.4338 | 24.686 | 7.0667 |
+ | 0.2188 | 89.0 | 712 | 2.4343 | 24.686 | 7.0667 |
+ | 0.2188 | 90.0 | 720 | 2.4372 | 24.686 | 7.0667 |
+ | 0.2188 | 91.0 | 728 | 2.4386 | 24.686 | 7.1333 |
+ | 0.2188 | 92.0 | 736 | 2.4396 | 24.686 | 7.1333 |
+ | 0.2188 | 93.0 | 744 | 2.4403 | 24.686 | 7.1333 |
+ | 0.2188 | 94.0 | 752 | 2.4409 | 24.686 | 7.1333 |
+ | 0.2188 | 95.0 | 760 | 2.4416 | 24.686 | 7.1333 |
+ | 0.2188 | 96.0 | 768 | 2.4415 | 24.686 | 7.1333 |
+ | 0.2188 | 97.0 | 776 | 2.4410 | 24.686 | 7.1333 |
+ | 0.2188 | 98.0 | 784 | 2.4411 | 24.686 | 7.1333 |
+ | 0.2188 | 99.0 | 792 | 2.4407 | 24.686 | 7.1333 |
+ | 0.2188 | 100.0 | 800 | 2.4410 | 24.686 | 7.1333 |
+
+
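The Bleu and Gen Len columns are the kind of metrics a `Seq2SeqTrainer` setup typically computes with sacreBLEU via the `evaluate` library; the function below is a sketch of that common pattern, not necessarily the exact code used for this run:

```python
import numpy as np
import evaluate

sacrebleu = evaluate.load("sacrebleu")

def build_compute_metrics(tokenizer):
    def compute_metrics(eval_preds):
        preds, labels = eval_preds
        if isinstance(preds, tuple):
            preds = preds[0]
        # -100 marks ignored label positions; restore the pad id before decoding
        labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
        decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
        decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
        bleu = sacrebleu.compute(
            predictions=[p.strip() for p in decoded_preds],
            references=[[l.strip()] for l in decoded_labels],
        )
        gen_len = np.mean(
            [np.count_nonzero(p != tokenizer.pad_token_id) for p in preds]
        )
        return {"bleu": round(bleu["score"], 4), "gen_len": round(float(gen_len), 4)}
    return compute_metrics
```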
+ ### Framework versions
+
+ - Transformers 4.41.2
+ - Pytorch 2.3.0+cu121
+ - Datasets 2.20.0
+ - Tokenizers 0.19.1
generation_config.json ADDED
@@ -0,0 +1,16 @@
+ {
+   "bad_words_ids": [
+     [
+       62801
+     ]
+   ],
+   "bos_token_id": 0,
+   "decoder_start_token_id": 62801,
+   "eos_token_id": 0,
+   "forced_eos_token_id": 0,
+   "max_length": 512,
+   "num_beams": 4,
+   "pad_token_id": 62801,
+   "renormalize_logits": true,
+   "transformers_version": "4.41.2"
+ }
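These decoding defaults are applied automatically whenever `generate()` is called on the model; a short sketch of inspecting them (again assuming the repository id used above):

```python
from transformers import GenerationConfig

# Assumed repository id; replace with the actual Hub id of this checkpoint.
gen_config = GenerationConfig.from_pretrained("cherifkhalifah/Tounsify-v0.7")
print(gen_config.num_beams)       # 4
print(gen_config.max_length)      # 512
print(gen_config.bad_words_ids)   # [[62801]] - blocks the pad/decoder-start token
```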
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:7353006334307468e51cc445c801ab1490f5c3ee0e9872e202e827a9887bf2e2
+ oid sha256:f251efd5fa70f4f9306c3ad4a7fcd3a803f5657cd1ecf34b2cd6a5717c17e2fe
  size 305452744
runs/Jun26_14-45-54_443a7ec9c21d/events.out.tfevents.1719413155.443a7ec9c21d.18222.2 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:87660eb8b5e479713cce7a1e1bc7b28629f0a1b1107701cd9069db1cc574ae8a
- size 29106
+ oid sha256:9142b5a424407eec936e4b4cbe2ff3325f45bfb97617d523124fde2ad1f39301
+ size 43150