RoBERTa-Base-SE2025T11A-sun-v20250108115409

This model is a fine-tuned version of w11wo/sundanese-roberta-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6184
  • F1 Macro: 0.6208
  • F1 Micro: 0.6202
  • F1 Weighted: 0.6194
  • F1 Samples: 0.6273
  • F1 Label Marah: 0.6095
  • F1 Label Jijik: 0.5636
  • F1 Label Takut: 0.5789
  • F1 Label Senang: 0.7907
  • F1 Label Sedih: 0.6420
  • F1 Label Terkejut: 0.5455
  • F1 Label Biasa: 0.6154
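
The per-label scores above suggest a multi-label emotion classification setup over seven Sundanese emotion labels (marah, jijik, takut, senang, sedih, terkejut, biasa). A minimal inference sketch under that assumption is given below; the repository id, the label order, and the sigmoid-with-0.5-threshold decision rule are assumptions, not confirmed by this card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed repository id and label order; check the model config's id2label mapping.
model_id = "alxxtexxr/RoBERTa-Base-SE2025T11A-sun-v20250108115409"
labels = ["marah", "jijik", "takut", "senang", "sedih", "terkejut", "biasa"]

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Abdi bagja pisan dinten ieu."  # "I am very happy today."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label decision: independent sigmoid per label with a 0.5 threshold (assumed).
probs = torch.sigmoid(logits).squeeze(0)
predicted = [label for label, p in zip(labels, probs) if p >= 0.5]
print(predicted)
```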

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • num_epochs: 15
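
For reference, a hedged sketch of how these hyperparameters might map onto a Hugging Face TrainingArguments configuration is shown below; the output directory and anything not listed above (evaluation and saving strategy, dataset handling, the Trainer setup itself) are omitted or assumed.

```python
from transformers import TrainingArguments

# Sketch matching the hyperparameters listed above; values not listed are assumptions.
training_args = TrainingArguments(
    output_dir="./roberta-base-se2025t11a-sun",  # assumed
    learning_rate=2e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=15,
)
```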

Training results

Training Loss Epoch Step Validation Loss F1 Macro F1 Micro F1 Weighted F1 Samples F1 Label Marah F1 Label Jijik F1 Label Takut F1 Label Senang F1 Label Sedih F1 Label Terkejut F1 Label Biasa
0.5025 0.1805 100 0.4545 0.0381 0.0539 0.0497 0.0345 0.1613 0.1053 0.0 0.0 0.0 0.0 0.0
0.4938 0.3610 200 0.4332 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.4204 0.5415 300 0.4173 0.1104 0.2042 0.1134 0.1276 0.0 0.0392 0.0 0.7333 0.0 0.0 0.0
0.4333 0.7220 400 0.3903 0.1978 0.2811 0.2130 0.1677 0.0357 0.3030 0.0 0.6747 0.0 0.3714 0.0
0.4342 0.9025 500 0.3825 0.2448 0.3387 0.2510 0.2523 0.1356 0.1818 0.5484 0.7527 0.0952 0.0 0.0
0.3831 1.0830 600 0.3514 0.3626 0.4293 0.3931 0.3386 0.4444 0.3544 0.4 0.7209 0.4444 0.1739 0.0
0.3454 1.2635 700 0.3686 0.4495 0.5185 0.4910 0.4575 0.5690 0.5225 0.5763 0.7105 0.2979 0.4706 0.0
0.3233 1.4440 800 0.3443 0.4102 0.4751 0.4400 0.4039 0.3143 0.4854 0.3256 0.7561 0.6349 0.3548 0.0
0.3341 1.6245 900 0.3174 0.5738 0.5861 0.5726 0.5318 0.6 0.425 0.6667 0.7792 0.6410 0.3860 0.5185
0.3311 1.8051 1000 0.3233 0.5625 0.5672 0.5512 0.4868 0.5476 0.2951 0.6780 0.7381 0.5797 0.5275 0.5714
0.3157 1.9856 1100 0.3160 0.4992 0.5796 0.5461 0.5578 0.6549 0.5625 0.5882 0.7742 0.4528 0.4615 0.0
0.2242 2.1661 1200 0.3282 0.5922 0.5938 0.5947 0.5722 0.5435 0.5607 0.6667 0.7778 0.5574 0.5393 0.5
0.2403 2.3466 1300 0.3081 0.5586 0.5909 0.5725 0.5541 0.6263 0.3881 0.6753 0.7848 0.5672 0.5352 0.3333
0.2142 2.5271 1400 0.3479 0.5597 0.5827 0.5710 0.5610 0.5287 0.5806 0.5385 0.7711 0.6301 0.4688 0.4
0.2091 2.7076 1500 0.3254 0.5764 0.5992 0.5869 0.5719 0.6804 0.4615 0.6774 0.7527 0.5 0.5185 0.4444
0.212 2.8881 1600 0.3223 0.5793 0.6154 0.6027 0.5988 0.6786 0.5814 0.6875 0.7529 0.5085 0.5263 0.32
0.1939 3.0686 1700 0.3372 0.6343 0.6274 0.6286 0.6168 0.6154 0.5349 0.6667 0.7733 0.6286 0.5714 0.65
0.1382 3.2491 1800 0.3555 0.6085 0.6090 0.5999 0.5952 0.6095 0.4658 0.5974 0.7912 0.5974 0.5316 0.6667
0.1494 3.4296 1900 0.3503 0.6094 0.6099 0.6089 0.6029 0.6383 0.5455 0.6296 0.75 0.5625 0.5352 0.6047
0.129 3.6101 2000 0.3766 0.6004 0.6106 0.6058 0.5970 0.5918 0.5057 0.6364 0.7901 0.6027 0.5934 0.4828
0.1182 3.7906 2100 0.3589 0.6283 0.6298 0.6276 0.6287 0.6731 0.5176 0.6765 0.7654 0.6176 0.5366 0.6111
0.143 3.9711 2200 0.3735 0.5977 0.6022 0.5984 0.6138 0.6306 0.5385 0.5556 0.7765 0.5814 0.5070 0.5946
0.1281 4.1516 2300 0.3730 0.6270 0.6306 0.6294 0.6273 0.6783 0.5417 0.5833 0.7711 0.6176 0.5909 0.6061
0.0804 4.3321 2400 0.4072 0.6253 0.6292 0.6277 0.6341 0.6604 0.5818 0.5867 0.8 0.6067 0.5352 0.6061
0.0829 4.5126 2500 0.4064 0.6110 0.6120 0.6140 0.6125 0.6306 0.5376 0.625 0.7222 0.6571 0.5542 0.55
0.1022 4.6931 2600 0.4271 0.6175 0.6218 0.6199 0.6228 0.6465 0.5825 0.6154 0.7556 0.5882 0.5432 0.5909
0.0707 4.8736 2700 0.4542 0.5901 0.5993 0.5936 0.6050 0.6408 0.5664 0.5455 0.7765 0.5312 0.4928 0.5778
0.0921 5.0542 2800 0.4340 0.5943 0.5982 0.5975 0.5992 0.6296 0.5714 0.5556 0.775 0.5758 0.4810 0.5714
0.0567 5.2347 2900 0.4351 0.5975 0.6098 0.6052 0.6092 0.6727 0.5227 0.5312 0.7674 0.6353 0.5238 0.5294
0.0567 5.4152 3000 0.4385 0.5999 0.6079 0.6068 0.6122 0.6604 0.5474 0.5714 0.7674 0.6133 0.5063 0.5333
0.0527 5.5957 3100 0.4416 0.6082 0.6140 0.6126 0.6150 0.6476 0.5684 0.6071 0.7586 0.6053 0.5122 0.5581
0.0611 5.7762 3200 0.4462 0.6147 0.6173 0.6153 0.6179 0.6667 0.5686 0.5352 0.7711 0.625 0.5 0.6364
0.0556 5.9567 3300 0.4505 0.6256 0.6264 0.6246 0.6195 0.6441 0.5474 0.5915 0.7654 0.6579 0.5479 0.625
0.046 6.1372 3400 0.4700 0.6178 0.6165 0.6159 0.6125 0.6542 0.5294 0.6410 0.75 0.5915 0.5333 0.625
0.0417 6.3177 3500 0.4695 0.6228 0.6255 0.6261 0.6296 0.66 0.5631 0.6333 0.7529 0.6 0.5714 0.5789
0.041 6.4982 3600 0.4619 0.6104 0.6162 0.6161 0.6047 0.6667 0.5111 0.64 0.75 0.6269 0.5517 0.5263
0.057 6.6787 3700 0.4954 0.5990 0.6051 0.6054 0.6164 0.6542 0.5306 0.6 0.7674 0.5647 0.5495 0.5263
0.0398 6.8592 3800 0.4771 0.6175 0.6211 0.6197 0.6167 0.6263 0.5333 0.6234 0.7711 0.6575 0.5526 0.5581
0.03 7.0397 3900 0.4944 0.6045 0.6101 0.6099 0.6123 0.6 0.5376 0.6410 0.7805 0.5870 0.5854 0.5
0.032 7.2202 4000 0.4920 0.6333 0.6381 0.6356 0.6497 0.6446 0.5849 0.5714 0.8 0.6410 0.5854 0.6061
0.0359 7.4007 4100 0.5033 0.6265 0.6241 0.6246 0.6302 0.6271 0.5577 0.6269 0.7805 0.5897 0.5783 0.625
0.0257 7.5812 4200 0.5155 0.6090 0.6206 0.6199 0.6225 0.6535 0.5872 0.6176 0.7955 0.5634 0.5581 0.4878
0.033 7.7617 4300 0.5402 0.6063 0.6109 0.6123 0.6146 0.6286 0.5593 0.6 0.7907 0.5897 0.5495 0.5263
0.0273 7.9422 4400 0.5259 0.6179 0.6190 0.6187 0.6272 0.6538 0.5143 0.5797 0.7907 0.6076 0.5789 0.6
0.0223 8.1227 4500 0.5211 0.6203 0.6248 0.6217 0.6291 0.6481 0.5055 0.6420 0.7907 0.6197 0.5647 0.5714
0.0193 8.3032 4600 0.5428 0.6230 0.6241 0.6246 0.6380 0.6230 0.5455 0.6230 0.8 0.6269 0.5714 0.5714
0.0238 8.4838 4700 0.5473 0.6188 0.6172 0.6170 0.6384 0.6107 0.5192 0.6389 0.7816 0.6154 0.5714 0.5946
0.021 8.6643 4800 0.5323 0.6220 0.6263 0.6265 0.6354 0.6034 0.6055 0.6 0.8 0.6588 0.5405 0.5455
0.0205 8.8448 4900 0.5348 0.6103 0.6132 0.6150 0.6224 0.6481 0.5517 0.6032 0.7561 0.6265 0.5366 0.55
0.017 9.0253 5000 0.5623 0.6290 0.625 0.6273 0.6471 0.6610 0.5664 0.5366 0.7765 0.6173 0.5783 0.6667
0.013 9.2058 5100 0.5400 0.6090 0.6157 0.6160 0.6203 0.6458 0.5794 0.5634 0.7765 0.5926 0.5647 0.5405
0.0174 9.3863 5200 0.5628 0.6281 0.6278 0.6273 0.6419 0.6379 0.5714 0.6 0.7816 0.6133 0.5641 0.6286
0.0139 9.5668 5300 0.5469 0.6134 0.6162 0.6139 0.6137 0.6139 0.5333 0.6316 0.7857 0.5977 0.56 0.5714
0.0139 9.7473 5400 0.5626 0.6235 0.6234 0.6234 0.6303 0.6316 0.5437 0.5867 0.7907 0.6575 0.55 0.6047
0.0174 9.9278 5500 0.5755 0.6210 0.6186 0.6194 0.6354 0.6087 0.5391 0.5753 0.7907 0.6582 0.5641 0.6111
0.0151 10.1083 5600 0.5767 0.6237 0.6231 0.6229 0.6362 0.6341 0.5686 0.575 0.7619 0.6410 0.5570 0.6286
0.0091 10.2888 5700 0.5831 0.6046 0.6073 0.6072 0.6171 0.6019 0.5545 0.5714 0.7711 0.6279 0.55 0.5556
0.0092 10.4693 5800 0.5728 0.6110 0.6124 0.6115 0.6203 0.6122 0.5437 0.5789 0.7765 0.6279 0.5526 0.5854
0.0093 10.6498 5900 0.5824 0.6150 0.6167 0.6156 0.6291 0.6379 0.5577 0.5570 0.7907 0.6053 0.5455 0.6111
0.0133 10.8303 6000 0.5804 0.6110 0.6151 0.6139 0.6255 0.6435 0.5455 0.5946 0.7765 0.6022 0.5432 0.5714
0.01 11.0108 6100 0.5945 0.6012 0.6095 0.6095 0.6107 0.6154 0.5660 0.5970 0.7907 0.6341 0.5176 0.4878
0.0061 11.1913 6200 0.5868 0.6241 0.6277 0.6264 0.6426 0.6372 0.5741 0.6061 0.7907 0.6301 0.5455 0.5854
0.0066 11.3718 6300 0.5882 0.6120 0.6197 0.6194 0.6302 0.6481 0.5688 0.5797 0.7907 0.64 0.5301 0.5263
0.0082 11.5523 6400 0.5876 0.6208 0.6218 0.6205 0.6378 0.6071 0.5926 0.56 0.7907 0.6341 0.5455 0.6154
0.0083 11.7329 6500 0.6084 0.6128 0.6143 0.6135 0.6270 0.6055 0.5841 0.5714 0.7907 0.6118 0.5316 0.5946
0.0049 11.9134 6600 0.6021 0.6105 0.6138 0.6137 0.6239 0.6168 0.5536 0.6061 0.7907 0.6316 0.525 0.55
0.0055 12.0939 6700 0.5952 0.6152 0.6218 0.6207 0.6338 0.6182 0.5905 0.5823 0.7907 0.6341 0.55 0.5405
0.0043 12.2744 6800 0.6066 0.6157 0.6201 0.6193 0.6333 0.6087 0.6018 0.5556 0.7907 0.6173 0.5647 0.5714
0.0045 12.4549 6900 0.6073 0.6083 0.6159 0.6154 0.6269 0.6182 0.5818 0.5405 0.7907 0.6329 0.5679 0.5263
0.0059 12.6354 7000 0.6100 0.6160 0.6194 0.6200 0.6297 0.6364 0.5421 0.575 0.7907 0.6579 0.5542 0.5556
0.0045 12.8159 7100 0.6128 0.6199 0.6254 0.6248 0.6462 0.6496 0.5841 0.5556 0.7907 0.6579 0.5301 0.5714
0.0045 12.9964 7200 0.6132 0.6176 0.6182 0.6183 0.6407 0.6271 0.5766 0.5455 0.7907 0.6420 0.5301 0.6111
0.0055 13.1769 7300 0.6144 0.6232 0.6265 0.6262 0.6422 0.6422 0.5893 0.6 0.7907 0.625 0.5301 0.5854
0.0037 13.3574 7400 0.6152 0.6218 0.6224 0.6213 0.6326 0.6226 0.5818 0.6 0.7907 0.6173 0.525 0.6154
0.0052 13.5379 7500 0.6199 0.6218 0.6215 0.6212 0.6324 0.6168 0.5766 0.5833 0.7765 0.6341 0.55 0.6154
0.0036 13.7184 7600 0.6182 0.6085 0.6066 0.6069 0.6143 0.6111 0.5347 0.5366 0.7765 0.6420 0.5432 0.6154
0.0045 13.8989 7700 0.6240 0.6099 0.6087 0.6093 0.6185 0.6154 0.5688 0.5570 0.7619 0.6076 0.5432 0.6154
0.0043 14.0794 7800 0.6167 0.6223 0.6231 0.6229 0.6353 0.6429 0.5660 0.5714 0.7765 0.6341 0.55 0.6154
0.0037 14.2599 7900 0.6143 0.6190 0.6202 0.6197 0.6272 0.6226 0.5714 0.5789 0.7907 0.6329 0.5366 0.6
0.0045 14.4404 8000 0.6141 0.6270 0.6270 0.6269 0.6363 0.6355 0.5607 0.5789 0.7907 0.6579 0.55 0.6154
0.0038 14.6209 8100 0.6180 0.6244 0.6248 0.6247 0.6372 0.6355 0.5688 0.5641 0.7907 0.6579 0.5385 0.6154
0.0034 14.8014 8200 0.6189 0.6207 0.6200 0.6193 0.6290 0.6095 0.5636 0.5789 0.7907 0.6410 0.5455 0.6154
0.0033 14.9819 8300 0.6184 0.6208 0.6202 0.6194 0.6273 0.6095 0.5636 0.5789 0.7907 0.6420 0.5455 0.6154
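
The macro, micro, weighted, and samples F1 columns correspond to scikit-learn's averaging modes for multi-label F1. A minimal sketch of how such scores are typically computed from binary label matrices follows; the data and variable names are illustrative only, not taken from this training run.

```python
import numpy as np
from sklearn.metrics import f1_score

# Illustrative binary indicator matrices: rows = examples, columns = the 7 labels.
y_true = np.array([[1, 0, 0, 1, 0, 0, 0],
                   [0, 1, 0, 0, 1, 0, 0]])
y_pred = np.array([[1, 0, 0, 1, 0, 1, 0],
                   [0, 1, 0, 0, 0, 0, 0]])

for avg in ("macro", "micro", "weighted", "samples"):
    print(avg, f1_score(y_true, y_pred, average=avg, zero_division=0))

# Per-label F1 (one score per column), matching the per-label columns in the table.
print(f1_score(y_true, y_pred, average=None, zero_division=0))
```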

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0