Vichentito committed
Commit c4f164b
1 Parent(s): 3d41a25

End of training

README.md CHANGED
@@ -1,6 +1,4 @@
  ---
- license: apache-2.0
- base_model: Vichentito/Nahuatl_Espanol_v2
  tags:
  - generated_from_trainer
  metrics:
@@ -15,11 +13,11 @@ should probably proofread and complete it, then remove this comment. -->
 
  # Nahuatl_Espanol_vn
 
- This model is a fine-tuned version of [Vichentito/Nahuatl_Espanol_v2](https://huggingface.co/Vichentito/Nahuatl_Espanol_v2) on an unknown dataset.
+ This model was trained from scratch on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.4730
- - Bleu: 12.6156
- - Gen Len: 46.1122
+ - Loss: 1.1464
+ - Bleu: 15.4218
+ - Gen Len: 45.5239
 
  ## Model description
 
@@ -44,111 +42,45 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 10
+ - num_epochs: 3
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
  |:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|
- | No log | 0.1064 | 100 | 1.9442 | 6.1166 | 48.8218 |
- | No log | 0.2128 | 200 | 1.9259 | 6.1381 | 50.998 |
- | No log | 0.3191 | 300 | 1.9115 | 6.573 | 48.1678 |
- | No log | 0.4255 | 400 | 1.8855 | 6.4758 | 50.9589 |
- | 2.137 | 0.5319 | 500 | 1.8646 | 6.6628 | 50.1324 |
- | 2.137 | 0.6383 | 600 | 1.8440 | 6.8852 | 50.913 |
- | 2.137 | 0.7447 | 700 | 1.8311 | 6.879 | 49.7563 |
- | 2.137 | 0.8511 | 800 | 1.8105 | 7.268 | 48.5967 |
- | 2.137 | 0.9574 | 900 | 1.7907 | 7.5189 | 47.8909 |
- | 2.0246 | 1.0638 | 1000 | 1.7790 | 7.8039 | 49.7481 |
- | 2.0246 | 1.1702 | 1100 | 1.7751 | 7.8132 | 47.9985 |
- | 2.0246 | 1.2766 | 1200 | 1.7505 | 7.9468 | 47.9796 |
- | 2.0246 | 1.3830 | 1300 | 1.7378 | 8.1741 | 47.8028 |
- | 2.0246 | 1.4894 | 1400 | 1.7219 | 8.1614 | 48.2161 |
- | 1.8778 | 1.5957 | 1500 | 1.7178 | 8.3463 | 47.3984 |
- | 1.8778 | 1.7021 | 1600 | 1.7051 | 8.8493 | 48.1068 |
- | 1.8778 | 1.8085 | 1700 | 1.6907 | 8.5621 | 48.2402 |
- | 1.8778 | 1.9149 | 1800 | 1.6849 | 8.7522 | 49.7167 |
- | 1.8778 | 2.0213 | 1900 | 1.6738 | 8.9027 | 47.812 |
- | 1.7945 | 2.1277 | 2000 | 1.6718 | 9.323 | 47.1293 |
- | 1.7945 | 2.2340 | 2100 | 1.6619 | 9.1801 | 46.7211 |
- | 1.7945 | 2.3404 | 2200 | 1.6509 | 9.1763 | 47.085 |
- | 1.7945 | 2.4468 | 2300 | 1.6394 | 9.2575 | 47.9275 |
- | 1.7945 | 2.5532 | 2400 | 1.6388 | 9.5591 | 47.2517 |
- | 1.7164 | 2.6596 | 2500 | 1.6336 | 9.5656 | 47.996 |
- | 1.7164 | 2.7660 | 2600 | 1.6205 | 9.767 | 47.4039 |
- | 1.7164 | 2.8723 | 2700 | 1.6152 | 9.5891 | 47.2867 |
- | 1.7164 | 2.9787 | 2800 | 1.6074 | 9.7122 | 47.3419 |
- | 1.7164 | 3.0851 | 2900 | 1.6122 | 10.1634 | 47.2597 |
- | 1.6476 | 3.1915 | 3000 | 1.6016 | 10.0543 | 47.7276 |
- | 1.6476 | 3.2979 | 3100 | 1.5939 | 9.8821 | 47.8567 |
- | 1.6476 | 3.4043 | 3200 | 1.5922 | 10.1382 | 47.8498 |
- | 1.6476 | 3.5106 | 3300 | 1.5808 | 10.1617 | 46.7866 |
- | 1.6476 | 3.6170 | 3400 | 1.5780 | 10.2872 | 47.1357 |
- | 1.5916 | 3.7234 | 3500 | 1.5713 | 10.3594 | 47.6514 |
- | 1.5916 | 3.8298 | 3600 | 1.5657 | 10.3745 | 46.9836 |
- | 1.5916 | 3.9362 | 3700 | 1.5594 | 10.5178 | 46.7624 |
- | 1.5916 | 4.0426 | 3800 | 1.5704 | 10.665 | 46.6844 |
- | 1.5916 | 4.1489 | 3900 | 1.5589 | 10.6936 | 47.1421 |
- | 1.5475 | 4.2553 | 4000 | 1.5541 | 10.7949 | 46.8528 |
- | 1.5475 | 4.3617 | 4100 | 1.5481 | 10.631 | 47.3707 |
- | 1.5475 | 4.4681 | 4200 | 1.5468 | 10.8283 | 46.5979 |
- | 1.5475 | 4.5745 | 4300 | 1.5403 | 10.9811 | 47.1724 |
- | 1.5475 | 4.6809 | 4400 | 1.5356 | 11.0659 | 46.682 |
- | 1.4988 | 4.7872 | 4500 | 1.5379 | 11.0334 | 46.9275 |
- | 1.4988 | 4.8936 | 4600 | 1.5257 | 10.9602 | 46.5027 |
- | 1.4988 | 5.0 | 4700 | 1.5260 | 11.1289 | 46.8976 |
- | 1.4988 | 5.1064 | 4800 | 1.5311 | 11.1567 | 46.4451 |
- | 1.4988 | 5.2128 | 4900 | 1.5274 | 11.3486 | 46.6272 |
- | 1.4535 | 5.3191 | 5000 | 1.5259 | 11.2413 | 46.9351 |
- | 1.4535 | 5.4255 | 5100 | 1.5215 | 11.3214 | 46.8237 |
- | 1.4535 | 5.5319 | 5200 | 1.5129 | 11.4718 | 47.1328 |
- | 1.4535 | 5.6383 | 5300 | 1.5125 | 11.4864 | 46.6589 |
- | 1.4535 | 5.7447 | 5400 | 1.5121 | 11.5694 | 46.5577 |
- | 1.4219 | 5.8511 | 5500 | 1.5036 | 11.6487 | 46.5487 |
- | 1.4219 | 5.9574 | 5600 | 1.5000 | 11.5189 | 46.5733 |
- | 1.4219 | 6.0638 | 5700 | 1.5075 | 11.5882 | 46.5391 |
- | 1.4219 | 6.1702 | 5800 | 1.5096 | 11.7659 | 46.1593 |
- | 1.4219 | 6.2766 | 5900 | 1.5083 | 11.5189 | 46.4194 |
- | 1.3736 | 6.3830 | 6000 | 1.4987 | 11.7254 | 46.3748 |
- | 1.3736 | 6.4894 | 6100 | 1.4974 | 11.709 | 46.7318 |
- | 1.3736 | 6.5957 | 6200 | 1.4940 | 11.7516 | 46.5484 |
- | 1.3736 | 6.7021 | 6300 | 1.4918 | 11.828 | 46.4844 |
- | 1.3736 | 6.8085 | 6400 | 1.4933 | 11.9539 | 46.5024 |
- | 1.3705 | 6.9149 | 6500 | 1.4856 | 11.8196 | 46.6158 |
- | 1.3705 | 7.0213 | 6600 | 1.4959 | 11.8671 | 46.5148 |
- | 1.3705 | 7.1277 | 6700 | 1.4959 | 11.9404 | 46.1803 |
- | 1.3705 | 7.2340 | 6800 | 1.4974 | 12.0784 | 46.2473 |
- | 1.3705 | 7.3404 | 6900 | 1.4882 | 12.3014 | 46.27 |
- | 1.3223 | 7.4468 | 7000 | 1.4813 | 12.0859 | 46.5862 |
- | 1.3223 | 7.5532 | 7100 | 1.4846 | 12.1787 | 46.0993 |
- | 1.3223 | 7.6596 | 7200 | 1.4853 | 12.1633 | 46.142 |
- | 1.3223 | 7.7660 | 7300 | 1.4811 | 12.1962 | 46.4309 |
- | 1.3223 | 7.8723 | 7400 | 1.4819 | 12.1183 | 46.0882 |
- | 1.3154 | 7.9787 | 7500 | 1.4757 | 12.2428 | 46.2431 |
- | 1.3154 | 8.0851 | 7600 | 1.4811 | 12.2027 | 46.3626 |
- | 1.3154 | 8.1915 | 7700 | 1.4803 | 12.3011 | 46.328 |
- | 1.3154 | 8.2979 | 7800 | 1.4830 | 12.2846 | 46.3101 |
- | 1.3154 | 8.4043 | 7900 | 1.4808 | 12.3297 | 45.987 |
- | 1.2766 | 8.5106 | 8000 | 1.4789 | 12.3831 | 46.1575 |
- | 1.2766 | 8.6170 | 8100 | 1.4774 | 12.4203 | 46.2323 |
- | 1.2766 | 8.7234 | 8200 | 1.4737 | 12.5194 | 46.2774 |
- | 1.2766 | 8.8298 | 8300 | 1.4738 | 12.3472 | 46.2114 |
- | 1.2766 | 8.9362 | 8400 | 1.4687 | 12.3894 | 46.3324 |
- | 1.2752 | 9.0426 | 8500 | 1.4748 | 12.4876 | 46.0959 |
- | 1.2752 | 9.1489 | 8600 | 1.4792 | 12.597 | 45.985 |
- | 1.2752 | 9.2553 | 8700 | 1.4761 | 12.5547 | 46.2209 |
- | 1.2752 | 9.3617 | 8800 | 1.4759 | 12.5615 | 46.0812 |
- | 1.2752 | 9.4681 | 8900 | 1.4752 | 12.5736 | 46.1437 |
- | 1.2454 | 9.5745 | 9000 | 1.4765 | 12.5976 | 46.0358 |
- | 1.2454 | 9.6809 | 9100 | 1.4745 | 12.5204 | 46.1139 |
- | 1.2454 | 9.7872 | 9200 | 1.4735 | 12.5765 | 46.107 |
- | 1.2454 | 9.8936 | 9300 | 1.4732 | 12.5875 | 46.1734 |
- | 1.2454 | 10.0 | 9400 | 1.4730 | 12.6156 | 46.1122 |
+ | No log | 0.1064 | 100 | 1.1525 | 14.738 | 46.3599 |
+ | No log | 0.2128 | 200 | 1.1682 | 14.2823 | 45.9297 |
+ | No log | 0.3191 | 300 | 1.1739 | 14.2118 | 46.4243 |
+ | No log | 0.4255 | 400 | 1.1799 | 14.3198 | 45.9266 |
+ | 1.3984 | 0.5319 | 500 | 1.1771 | 14.0972 | 46.2179 |
+ | 1.3984 | 0.6383 | 600 | 1.1752 | 14.4083 | 45.8709 |
+ | 1.3984 | 0.7447 | 700 | 1.1756 | 14.1914 | 46.0949 |
+ | 1.3984 | 0.8511 | 800 | 1.1761 | 14.4131 | 46.0528 |
+ | 1.3984 | 0.9574 | 900 | 1.1727 | 14.1957 | 46.4856 |
+ | 1.3826 | 1.0638 | 1000 | 1.1768 | 14.7451 | 45.7873 |
+ | 1.3826 | 1.1702 | 1100 | 1.1727 | 14.6016 | 45.8654 |
+ | 1.3826 | 1.2766 | 1200 | 1.1726 | 14.6549 | 45.6857 |
+ | 1.3826 | 1.3830 | 1300 | 1.1693 | 14.586 | 45.6052 |
+ | 1.3826 | 1.4894 | 1400 | 1.1704 | 14.6483 | 45.6039 |
+ | 1.2932 | 1.5957 | 1500 | 1.1638 | 14.921 | 45.5508 |
+ | 1.2932 | 1.7021 | 1600 | 1.1649 | 14.7977 | 45.3693 |
+ | 1.2932 | 1.8085 | 1700 | 1.1580 | 14.9676 | 45.7072 |
+ | 1.2932 | 1.9149 | 1800 | 1.1567 | 14.794 | 45.5877 |
+ | 1.2932 | 2.0213 | 1900 | 1.1607 | 15.3066 | 45.677 |
+ | 1.2612 | 2.1277 | 2000 | 1.1569 | 15.1152 | 45.4122 |
+ | 1.2612 | 2.2340 | 2100 | 1.1553 | 15.2526 | 45.4026 |
+ | 1.2612 | 2.3404 | 2200 | 1.1521 | 15.2022 | 45.3518 |
+ | 1.2612 | 2.4468 | 2300 | 1.1505 | 15.3072 | 45.5873 |
+ | 1.2612 | 2.5532 | 2400 | 1.1500 | 15.417 | 45.5906 |
+ | 1.2095 | 2.6596 | 2500 | 1.1507 | 15.394 | 45.4383 |
+ | 1.2095 | 2.7660 | 2600 | 1.1501 | 15.4171 | 45.4846 |
+ | 1.2095 | 2.8723 | 2700 | 1.1472 | 15.4497 | 45.5049 |
+ | 1.2095 | 2.9787 | 2800 | 1.1464 | 15.4218 | 45.5239 |
 
 
  ### Framework versions
 
- - Transformers 4.40.2
- - Pytorch 2.2.1+cu121
+ - Transformers 4.41.0
+ - Pytorch 2.3.0+cu121
  - Datasets 2.19.1
  - Tokenizers 0.19.1

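Note: the Bleu and Gen Len columns in the table above are the kind of numbers a `generated_from_trainer` card typically gets from a `compute_metrics` callback. The sketch below shows one common way to produce them with the `evaluate` library's sacrebleu metric; it is illustrative only and not taken from this repository's training code (the function and the explicit `tokenizer` argument are assumptions).

```python
import numpy as np
import evaluate

# Hypothetical sketch of a seq2seq compute_metrics function; in a real
# Seq2SeqTrainer setup the tokenizer usually comes from the enclosing scope.
bleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds, tokenizer):
    preds, labels = eval_preds
    # Replace the -100 padding used for labels before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    result = bleu.compute(
        predictions=[p.strip() for p in decoded_preds],
        references=[[l.strip()] for l in decoded_labels],
    )
    # "Gen Len" is the mean generated length in tokens (non-pad positions).
    gen_len = np.mean(
        [np.count_nonzero(pred != tokenizer.pad_token_id) for pred in preds]
    )
    return {"bleu": result["score"], "gen_len": gen_len}
```
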
generation_config.json CHANGED
@@ -2,5 +2,5 @@
   "decoder_start_token_id": 0,
   "eos_token_id": 1,
   "pad_token_id": 0,
-  "transformers_version": "4.40.2"
+  "transformers_version": "4.41.0"
  }
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:085df151370a55b0b3b5522779b3eed5cc9574af9e2d8e37b999134b84c2ef2e
+ oid sha256:c9807813013210f77dc02f9aa13c5f44a6955a8fe6d7349e931c9ef4c3a8e5da
  size 990345064
runs/May27_23-40-42_a52f934a4766/events.out.tfevents.1716853243.a52f934a4766.789.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:d27a760ff4afec067f24ebdd1434068ffbf3f56f782ed00bea6bce9147dfc714
- size 16219
+ oid sha256:04a984e27e97bd53c198a820ae046a5a221b793601cb0a3d5ca1154a21442e4f
+ size 17683
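Usage note: given the seq2seq-style generation_config above (decoder_start_token_id 0, eos_token_id 1, pad_token_id 0) and the model card's translation metrics, the checkpoint can presumably be loaded as a sequence-to-sequence model with transformers. The repo id below is an assumption based on the model name in this card, and the input text is only a placeholder; the card does not document the expected prompt format.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Assumed Hub path; substitute the actual repository id if it differs.
model_id = "Vichentito/Nahuatl_Espanol_vn"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Illustrative input sentence; replace with real Nahuatl or Spanish text.
inputs = tokenizer("texto de ejemplo", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```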