RoBERTa-Base-SE2025T11A-sun-v20250111123127

This model is a fine-tuned version of w11wo/sundanese-roberta-base-emotion-classifier on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2025
  • F1 Macro: 0.7854
  • F1 Micro: 0.7849
  • F1 Weighted: 0.7826
  • F1 Samples: 0.7979
  • F1 Label Marah (anger): 0.76
  • F1 Label Jijik (disgust): 0.7077
  • F1 Label Takut (fear): 0.8271
  • F1 Label Senang (joy): 0.8489
  • F1 Label Sedih (sadness): 0.8125
  • F1 Label Terkejut (surprise): 0.7463
  • F1 Label Biasa (neutral): 0.7955

Model description

RoBERTa-base checkpoint (~125M parameters, F32 safetensors) for Sundanese emotion classification. More information needed.

Intended uses & limitations

More information needed
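
Pending proper documentation, the sketch below shows one way to run inference with this checkpoint. It assumes a multi-label emotion head (suggested by the samples-averaged and per-label F1 scores above); the 0.5 sigmoid threshold and the example sentence are illustrative, not from the author.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "alxxtexxr/RoBERTa-Base-SE2025T11A-sun-v20250111123127"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Example Sundanese input ("I am very happy today!")
text = "Abdi bingah pisan dinten ieu!"
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Sigmoid per label, assuming a multi-label head; 0.5 is a
# hypothetical threshold to be tuned on validation data.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p >= 0.5]
print(predicted)
```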

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list for how they map onto the Trainer API):

  • learning_rate: 2e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 2
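
For reference, here is a minimal sketch of how these values map onto transformers' TrainingArguments. It is reconstructed from the list above, not the author's actual training script; output_dir and any unlisted settings are placeholders.

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameters listed above; output_dir and
# any unlisted settings (warmup, weight decay, ...) are placeholders.
training_args = TrainingArguments(
    output_dir="RoBERTa-Base-SE2025T11A-sun",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=2,
)
```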

Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Micro | F1 Weighted | F1 Samples | F1 Label Marah | F1 Label Jijik | F1 Label Takut | F1 Label Senang | F1 Label Sedih | F1 Label Terkejut | F1 Label Biasa |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.4598 | 0.0615 | 100 | 0.4207 | 0.1498 | 0.1682 | 0.1502 | 0.1043 | 0.0 | 0.0 | 0.3333 | 0.3415 | 0.3736 | 0.0 | 0.0 |
| 0.4187 | 0.1229 | 200 | 0.3878 | 0.2292 | 0.3043 | 0.2263 | 0.2065 | 0.0 | 0.0 | 0.6154 | 0.7273 | 0.2619 | 0.0 | 0.0 |
| 0.3714 | 0.1844 | 300 | 0.3536 | 0.3304 | 0.4103 | 0.3360 | 0.3062 | 0.2340 | 0.0260 | 0.6349 | 0.7424 | 0.5741 | 0.1013 | 0.0 |
| 0.3921 | 0.2459 | 400 | 0.3424 | 0.3967 | 0.4617 | 0.4097 | 0.3830 | 0.2963 | 0.0741 | 0.6076 | 0.6667 | 0.6271 | 0.5049 | 0.0 |
| 0.3477 | 0.3073 | 500 | 0.3091 | 0.4466 | 0.5140 | 0.4500 | 0.4220 | 0.1163 | 0.1190 | 0.6429 | 0.7857 | 0.6667 | 0.6387 | 0.1569 |
| 0.3327 | 0.3688 | 600 | 0.2963 | 0.5397 | 0.5707 | 0.5325 | 0.4967 | 0.1379 | 0.3363 | 0.6607 | 0.8143 | 0.704 | 0.6610 | 0.4638 |
| 0.2976 | 0.4302 | 700 | 0.2937 | 0.5589 | 0.5972 | 0.5634 | 0.5337 | 0.5658 | 0.1839 | 0.6355 | 0.8054 | 0.6829 | 0.6723 | 0.3667 |
| 0.3519 | 0.4917 | 800 | 0.2988 | 0.5831 | 0.6233 | 0.5931 | 0.5851 | 0.6392 | 0.26 | 0.6095 | 0.8 | 0.7656 | 0.6861 | 0.3214 |
| 0.3027 | 0.5532 | 900 | 0.2769 | 0.6593 | 0.6607 | 0.6576 | 0.6478 | 0.5414 | 0.5290 | 0.6838 | 0.8143 | 0.7752 | 0.6713 | 0.6 |
| 0.3005 | 0.6146 | 1000 | 0.2860 | 0.6349 | 0.6359 | 0.6315 | 0.6232 | 0.5989 | 0.5124 | 0.6833 | 0.7770 | 0.6306 | 0.6126 | 0.6292 |
| 0.286 | 0.6761 | 1100 | 0.2782 | 0.6616 | 0.6651 | 0.6543 | 0.6466 | 0.512 | 0.5397 | 0.6606 | 0.8138 | 0.7907 | 0.6154 | 0.6990 |
| 0.2906 | 0.7376 | 1200 | 0.2692 | 0.6758 | 0.6721 | 0.6701 | 0.6544 | 0.6027 | 0.5397 | 0.7273 | 0.736 | 0.7402 | 0.6612 | 0.7234 |
| 0.2635 | 0.7990 | 1300 | 0.2740 | 0.6747 | 0.6765 | 0.6706 | 0.6649 | 0.6225 | 0.4696 | 0.7 | 0.7273 | 0.7910 | 0.7134 | 0.6988 |
| 0.2707 | 0.8605 | 1400 | 0.2666 | 0.6842 | 0.6842 | 0.6798 | 0.6835 | 0.6164 | 0.5496 | 0.7193 | 0.8163 | 0.6957 | 0.6977 | 0.6947 |
| 0.2577 | 0.9219 | 1500 | 0.2654 | 0.6803 | 0.6791 | 0.6753 | 0.6712 | 0.6176 | 0.5909 | 0.7304 | 0.8116 | 0.6545 | 0.6525 | 0.7045 |
| 0.2706 | 0.9834 | 1600 | 0.2431 | 0.7172 | 0.7195 | 0.7123 | 0.7241 | 0.7013 | 0.5391 | 0.7480 | 0.8227 | 0.7481 | 0.7087 | 0.7527 |
| 0.1988 | 1.0449 | 1700 | 0.2480 | 0.7168 | 0.7159 | 0.7133 | 0.7229 | 0.6853 | 0.5846 | 0.7368 | 0.7971 | 0.7419 | 0.7353 | 0.7368 |
| 0.1967 | 1.1063 | 1800 | 0.2546 | 0.7258 | 0.7233 | 0.7225 | 0.7278 | 0.6994 | 0.6377 | 0.7257 | 0.7794 | 0.7377 | 0.7429 | 0.7579 |
| 0.1776 | 1.1678 | 1900 | 0.2506 | 0.7204 | 0.7159 | 0.7137 | 0.7278 | 0.6324 | 0.6410 | 0.7009 | 0.8112 | 0.7377 | 0.7287 | 0.7912 |
| 0.2134 | 1.2293 | 2000 | 0.2525 | 0.7270 | 0.7254 | 0.7208 | 0.7278 | 0.6892 | 0.6364 | 0.7059 | 0.8212 | 0.7119 | 0.7244 | 0.8 |
| 0.1548 | 1.2907 | 2100 | 0.2526 | 0.7236 | 0.7240 | 0.7191 | 0.7315 | 0.6944 | 0.5873 | 0.7288 | 0.8252 | 0.7480 | 0.7244 | 0.7573 |
| 0.2253 | 1.3522 | 2200 | 0.2434 | 0.7398 | 0.7381 | 0.7344 | 0.7498 | 0.7007 | 0.6986 | 0.7154 | 0.8082 | 0.7119 | 0.7353 | 0.8081 |
| 0.173 | 1.4136 | 2300 | 0.2415 | 0.7482 | 0.7447 | 0.7427 | 0.7578 | 0.7083 | 0.6528 | 0.7130 | 0.8201 | 0.7581 | 0.7681 | 0.8172 |
| 0.1966 | 1.4751 | 2400 | 0.2363 | 0.7519 | 0.7497 | 0.7480 | 0.7644 | 0.7067 | 0.6906 | 0.7288 | 0.8175 | 0.7692 | 0.7586 | 0.7917 |
| 0.1677 | 1.5366 | 2500 | 0.2360 | 0.7363 | 0.7337 | 0.7316 | 0.7439 | 0.7037 | 0.6190 | 0.752 | 0.7786 | 0.7402 | 0.7692 | 0.7912 |
| 0.2415 | 1.5980 | 2600 | 0.2320 | 0.7436 | 0.7424 | 0.7389 | 0.7483 | 0.7133 | 0.6142 | 0.7419 | 0.8116 | 0.7581 | 0.7746 | 0.7912 |
| 0.1749 | 1.6595 | 2700 | 0.2327 | 0.7410 | 0.7381 | 0.7371 | 0.7544 | 0.7123 | 0.6753 | 0.7395 | 0.8028 | 0.7167 | 0.7576 | 0.7826 |
| 0.1499 | 1.7210 | 2800 | 0.2296 | 0.7573 | 0.7541 | 0.7544 | 0.7650 | 0.7037 | 0.7050 | 0.7563 | 0.8030 | 0.7597 | 0.7910 | 0.7826 |
| 0.1535 | 1.7824 | 2900 | 0.2274 | 0.7599 | 0.7589 | 0.7577 | 0.7696 | 0.7248 | 0.7194 | 0.7458 | 0.8296 | 0.7752 | 0.7538 | 0.7708 |
| 0.1818 | 1.8439 | 3000 | 0.2263 | 0.7524 | 0.7511 | 0.7487 | 0.7626 | 0.7020 | 0.6769 | 0.7438 | 0.8227 | 0.7619 | 0.7761 | 0.7835 |
| 0.179 | 1.9053 | 3100 | 0.2274 | 0.7596 | 0.7583 | 0.7569 | 0.7717 | 0.7152 | 0.7206 | 0.7563 | 0.8056 | 0.7480 | 0.7879 | 0.7835 |
| 0.1883 | 1.9668 | 3200 | 0.2253 | 0.7560 | 0.7544 | 0.7530 | 0.7684 | 0.7105 | 0.7111 | 0.7438 | 0.8056 | 0.768 | 0.7692 | 0.7835 |
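
The four aggregate columns correspond to scikit-learn's F1 averaging modes over binary label matrices. A minimal sketch of how such metrics can be computed; the toy data and the use of scikit-learn here are assumptions for illustration, not the author's evaluation script.

```python
import numpy as np
from sklearn.metrics import f1_score

# Toy binary indicator matrices: rows = examples, columns = the 7 labels
# (marah, jijik, takut, senang, sedih, terkejut, biasa).
y_true = np.array([[1, 0, 0, 1, 0, 0, 0],
                   [0, 1, 0, 0, 1, 0, 0]])
y_pred = np.array([[1, 0, 0, 0, 0, 0, 0],
                   [0, 1, 0, 0, 1, 1, 0]])

for avg in ("macro", "micro", "weighted", "samples"):
    print(avg, f1_score(y_true, y_pred, average=avg, zero_division=0))

# Per-label scores, as in the "F1 Label ..." columns above
print(f1_score(y_true, y_pred, average=None, zero_division=0))
```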

Framework versions

  • Transformers 4.48.0
  • PyTorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0