muhammadravi251001 committed on
Commit 03c7c28
1 Parent(s): 46325c9

update model card README.md

Files changed (1)
  1. README.md +25 -27
README.md CHANGED
@@ -16,9 +16,9 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.0942
- - Exact Match: 64.3979
- - F1: 69.8535
+ - Loss: 1.1096
+ - Exact Match: 63.3508
+ - F1: 69.1464
 
  ## Model description
 
@@ -38,10 +38,10 @@ More information needed
 
  The following hyperparameters were used during training:
  - learning_rate: 1e-05
- - train_batch_size: 8
- - eval_batch_size: 8
+ - train_batch_size: 4
+ - eval_batch_size: 4
  - seed: 42
- - gradient_accumulation_steps: 16
+ - gradient_accumulation_steps: 32
  - total_train_batch_size: 128
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
@@ -51,31 +51,29 @@ The following hyperparameters were used during training:
 
  | Training Loss | Epoch | Step | Validation Loss | Exact Match | F1 |
  |:-------------:|:-----:|:----:|:---------------:|:-----------:|:-------:|
- | 6.1478 | 0.49 | 36 | 2.4099 | 49.8691 | 49.8691 |
- | 3.581 | 0.98 | 72 | 1.9550 | 49.8691 | 49.8691 |
- | 2.195 | 1.48 | 108 | 1.8446 | 49.3455 | 49.8564 |
- | 2.195 | 1.97 | 144 | 1.7512 | 49.4764 | 51.2613 |
- | 2.0071 | 2.46 | 180 | 1.6324 | 49.6073 | 52.3994 |
- | 1.8105 | 2.95 | 216 | 1.5278 | 52.7487 | 55.8533 |
- | 1.6668 | 3.45 | 252 | 1.3938 | 56.6754 | 60.5142 |
- | 1.6668 | 3.94 | 288 | 1.3243 | 56.9372 | 62.8755 |
- | 1.4715 | 4.44 | 324 | 1.2475 | 60.6021 | 66.5376 |
- | 1.3112 | 4.93 | 360 | 1.2257 | 59.4241 | 65.0059 |
- | 1.3112 | 5.42 | 396 | 1.1793 | 60.9948 | 66.2895 |
- | 1.2443 | 5.91 | 432 | 1.1485 | 63.4817 | 69.0854 |
- | 1.1586 | 6.41 | 468 | 1.1178 | 64.1361 | 69.5844 |
- | 1.0895 | 6.9 | 504 | 1.1404 | 63.0890 | 68.6016 |
- | 1.0895 | 7.4 | 540 | 1.0862 | 65.7068 | 70.8093 |
- | 1.054 | 7.89 | 576 | 1.0959 | 64.7906 | 70.2001 |
- | 1.0231 | 8.38 | 612 | 1.1036 | 64.3979 | 69.7053 |
- | 1.0231 | 8.87 | 648 | 1.0698 | 65.8377 | 71.1488 |
- | 0.9985 | 9.37 | 684 | 1.0777 | 66.0995 | 71.3149 |
- | 0.9736 | 9.86 | 720 | 1.0942 | 64.3979 | 69.8535 |
+ | 6.2958 | 0.49 | 36 | 2.4489 | 50.0 | 50.0 |
+ | 3.6359 | 0.98 | 72 | 1.9610 | 49.8691 | 49.8691 |
+ | 2.1589 | 1.47 | 108 | 1.8244 | 48.9529 | 49.9787 |
+ | 2.1589 | 1.96 | 144 | 1.7041 | 49.7382 | 51.8819 |
+ | 1.9535 | 2.46 | 180 | 1.5846 | 51.0471 | 56.3706 |
+ | 1.731 | 2.95 | 216 | 1.4596 | 54.0576 | 58.8577 |
+ | 1.5809 | 3.44 | 252 | 1.3590 | 56.1518 | 61.7069 |
+ | 1.5809 | 3.93 | 288 | 1.3205 | 56.2827 | 61.9772 |
+ | 1.4244 | 4.42 | 324 | 1.2688 | 55.8901 | 61.8344 |
+ | 1.2687 | 4.91 | 360 | 1.2379 | 58.9005 | 64.5444 |
+ | 1.2687 | 5.4 | 396 | 1.1637 | 62.4346 | 67.9125 |
+ | 1.1989 | 5.89 | 432 | 1.1675 | 60.7330 | 66.2963 |
+ | 1.1131 | 6.38 | 468 | 1.1321 | 62.5654 | 68.1655 |
+ | 1.0568 | 6.87 | 504 | 1.1155 | 62.9581 | 68.6094 |
+ | 1.0568 | 7.37 | 540 | 1.0895 | 64.1361 | 69.6097 |
+ | 1.0099 | 7.86 | 576 | 1.1013 | 63.2199 | 69.0324 |
+ | 0.9784 | 8.35 | 612 | 1.1117 | 63.8743 | 69.3380 |
+ | 0.9784 | 8.84 | 648 | 1.1096 | 63.3508 | 69.1464 |
 
 
  ### Framework versions
 
- - Transformers 4.26.1
+ - Transformers 4.27.4
  - Pytorch 1.13.1+cu117
  - Datasets 2.2.0
  - Tokenizers 0.13.2
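
For reference, the updated hyperparameters in the diff above correspond roughly to the following Hugging Face `TrainingArguments`. This is a minimal sketch and not the training script from this commit: `output_dir` is a placeholder, and the number of epochs is not visible in the diff hunks shown here.

```python
# Sketch only: maps the hyperparameters listed in the README diff onto
# transformers.TrainingArguments. Values not shown in the diff (e.g. output_dir,
# num_train_epochs) are placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",          # placeholder, not from the commit
    learning_rate=1e-5,
    per_device_train_batch_size=4,   # train_batch_size: 4
    per_device_eval_batch_size=4,    # eval_batch_size: 4
    gradient_accumulation_steps=32,  # 4 * 32 = 128 total_train_batch_size
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```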
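The Exact Match and F1 metrics suggest a SQuAD-style extractive question-answering checkpoint, in which case it would typically be used through the `question-answering` pipeline. This is an assumed usage sketch: the model id below is a placeholder, not the repository name from this commit, and the example question/context are illustrative only.

```python
# Assumed usage for an extractive-QA model fine-tuned from
# indolem/indobert-base-uncased; replace the placeholder model id with the
# actual fine-tuned checkpoint.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="<username>/<this-fine-tuned-model>",  # placeholder id
)

result = qa(
    question="Siapa presiden pertama Indonesia?",   # "Who was Indonesia's first president?"
    context="Soekarno adalah presiden pertama Republik Indonesia.",
)
print(result["answer"], result["score"])
```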