gArthur98 committed on
Commit 40ac54a · 1 Parent(s): 6e69952

update model card README.md

Files changed (1)
  1. README.md +9 -97
README.md CHANGED
@@ -16,8 +16,8 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [microsoft/MiniLM-L12-H384-uncased](https://huggingface.co/microsoft/MiniLM-L12-H384-uncased) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.7486
- - F1: 0.7008
+ - Loss: 0.7772
+ - F1: 0.6668
 
  ## Model description
 
@@ -37,8 +37,8 @@ More information needed
 
  The following hyperparameters were used during training:
  - learning_rate: 5e-05
- - train_batch_size: 16
- - eval_batch_size: 16
+ - train_batch_size: 8
+ - eval_batch_size: 8
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
@@ -48,99 +48,11 @@ The following hyperparameters were used during training:
 
  | Training Loss | Epoch | Step | Validation Loss | F1     |
  |:-------------:|:-----:|:----:|:---------------:|:------:|
- | 1.0524 | 0.03 | 16 | 1.0197 | 0.2347 |
- | 1.0397 | 0.06 | 32 | 1.0153 | 0.3222 |
- | 1.0087 | 0.1 | 48 | 1.0201 | 0.2347 |
- | 0.9802 | 0.13 | 64 | 1.0013 | 0.2347 |
- | 0.949 | 0.16 | 80 | 0.9152 | 0.6212 |
- | 0.9329 | 0.19 | 96 | 0.8657 | 0.6436 |
- | 0.8902 | 0.22 | 112 | 0.8973 | 0.6240 |
- | 0.8901 | 0.26 | 128 | 0.8596 | 0.6516 |
- | 0.8262 | 0.29 | 144 | 0.8360 | 0.6539 |
- | 0.8605 | 0.32 | 160 | 0.8547 | 0.6398 |
- | 0.9094 | 0.35 | 176 | 0.8283 | 0.6541 |
- | 0.8707 | 0.38 | 192 | 0.8901 | 0.6295 |
- | 0.8547 | 0.42 | 208 | 0.8316 | 0.6540 |
- | 0.8568 | 0.45 | 224 | 0.8321 | 0.6548 |
- | 0.8083 | 0.48 | 240 | 0.8215 | 0.6593 |
- | 0.7565 | 0.51 | 256 | 0.8303 | 0.6563 |
- | 0.8115 | 0.55 | 272 | 0.8323 | 0.6495 |
- | 0.8178 | 0.58 | 288 | 0.8197 | 0.6617 |
- | 0.8053 | 0.61 | 304 | 0.8088 | 0.6558 |
- | 0.8391 | 0.64 | 320 | 0.7971 | 0.6645 |
- | 0.801 | 0.67 | 336 | 0.8174 | 0.6653 |
- | 0.7834 | 0.71 | 352 | 0.8067 | 0.6670 |
- | 0.9651 | 0.74 | 368 | 0.8145 | 0.6583 |
- | 0.8784 | 0.77 | 384 | 0.7989 | 0.6562 |
- | 0.7697 | 0.8 | 400 | 0.7972 | 0.6598 |
- | 0.8052 | 0.83 | 416 | 0.8148 | 0.6652 |
- | 0.8751 | 0.87 | 432 | 0.8164 | 0.6598 |
- | 0.7129 | 0.9 | 448 | 0.8133 | 0.6584 |
- | 0.8936 | 0.93 | 464 | 0.8270 | 0.6589 |
- | 0.8094 | 0.96 | 480 | 0.8134 | 0.6693 |
- | 0.766 | 0.99 | 496 | 0.8108 | 0.6500 |
- | 0.7925 | 1.03 | 512 | 0.8031 | 0.6724 |
- | 0.8091 | 1.06 | 528 | 0.7930 | 0.6566 |
- | 0.7489 | 1.09 | 544 | 0.7804 | 0.6726 |
- | 0.758 | 1.12 | 560 | 0.7905 | 0.6747 |
- | 0.7724 | 1.15 | 576 | 0.8150 | 0.6554 |
- | 0.7907 | 1.19 | 592 | 0.8257 | 0.6664 |
- | 0.7511 | 1.22 | 608 | 0.8143 | 0.6633 |
- | 0.8002 | 1.25 | 624 | 0.8200 | 0.6595 |
- | 0.6899 | 1.28 | 640 | 0.7943 | 0.6696 |
- | 0.7124 | 1.31 | 656 | 0.7859 | 0.6686 |
- | 0.8041 | 1.35 | 672 | 0.7914 | 0.6679 |
- | 0.7368 | 1.38 | 688 | 0.7879 | 0.6695 |
- | 0.762 | 1.41 | 704 | 0.7787 | 0.6728 |
- | 0.818 | 1.44 | 720 | 0.7835 | 0.6630 |
- | 0.8053 | 1.47 | 736 | 0.8011 | 0.6709 |
- | 0.7205 | 1.51 | 752 | 0.7788 | 0.6724 |
- | 0.7612 | 1.54 | 768 | 0.7954 | 0.6777 |
- | 0.7609 | 1.57 | 784 | 0.7807 | 0.6731 |
- | 0.8042 | 1.6 | 800 | 0.8015 | 0.6438 |
- | 0.8042 | 1.64 | 816 | 0.7597 | 0.6742 |
- | 0.7646 | 1.67 | 832 | 0.7603 | 0.6734 |
- | 0.6935 | 1.7 | 848 | 0.7777 | 0.6703 |
- | 0.6546 | 1.73 | 864 | 0.7795 | 0.6723 |
- | 0.813 | 1.76 | 880 | 0.7659 | 0.6698 |
- | 0.8416 | 1.8 | 896 | 0.7576 | 0.6988 |
- | 0.7131 | 1.83 | 912 | 0.7542 | 0.6787 |
- | 0.712 | 1.86 | 928 | 0.7733 | 0.6735 |
- | 0.8067 | 1.89 | 944 | 0.7587 | 0.6826 |
- | 0.7756 | 1.92 | 960 | 0.7638 | 0.6720 |
- | 0.7431 | 1.96 | 976 | 0.7625 | 0.6944 |
- | 0.7488 | 1.99 | 992 | 0.7541 | 0.6960 |
- | 0.6909 | 2.02 | 1008 | 0.7633 | 0.6945 |
- | 0.6786 | 2.05 | 1024 | 0.7695 | 0.6970 |
- | 0.6199 | 2.08 | 1040 | 0.7905 | 0.6788 |
- | 0.6761 | 2.12 | 1056 | 0.7850 | 0.6654 |
- | 0.7423 | 2.15 | 1072 | 0.7879 | 0.6713 |
- | 0.7749 | 2.18 | 1088 | 0.7580 | 0.7003 |
- | 0.7067 | 2.21 | 1104 | 0.7533 | 0.6960 |
- | 0.6976 | 2.24 | 1120 | 0.7576 | 0.6855 |
- | 0.6828 | 2.28 | 1136 | 0.7568 | 0.6978 |
- | 0.7357 | 2.31 | 1152 | 0.7606 | 0.6890 |
- | 0.6657 | 2.34 | 1168 | 0.7696 | 0.6856 |
- | 0.669 | 2.37 | 1184 | 0.7668 | 0.6935 |
- | 0.7595 | 2.4 | 1200 | 0.7564 | 0.7025 |
- | 0.7173 | 2.44 | 1216 | 0.7534 | 0.7058 |
- | 0.6722 | 2.47 | 1232 | 0.7553 | 0.6940 |
- | 0.6664 | 2.5 | 1248 | 0.7550 | 0.7053 |
- | 0.6947 | 2.53 | 1264 | 0.7519 | 0.7032 |
- | 0.7022 | 2.57 | 1280 | 0.7565 | 0.6977 |
- | 0.6904 | 2.6 | 1296 | 0.7560 | 0.6949 |
- | 0.7168 | 2.63 | 1312 | 0.7511 | 0.6994 |
- | 0.7475 | 2.66 | 1328 | 0.7533 | 0.6933 |
- | 0.6228 | 2.69 | 1344 | 0.7498 | 0.7011 |
- | 0.6361 | 2.73 | 1360 | 0.7565 | 0.6950 |
- | 0.7083 | 2.76 | 1376 | 0.7541 | 0.6990 |
- | 0.738 | 2.79 | 1392 | 0.7539 | 0.6984 |
- | 0.6655 | 2.82 | 1408 | 0.7612 | 0.6923 |
- | 0.6693 | 2.85 | 1424 | 0.7539 | 0.6991 |
- | 0.7481 | 2.89 | 1440 | 0.7545 | 0.6959 |
- | 0.7843 | 2.92 | 1456 | 0.7511 | 0.6954 |
- | 0.644 | 2.95 | 1472 | 0.7488 | 0.6998 |
- | 0.6741 | 2.98 | 1488 | 0.7486 | 0.7008 |
+ | 0.9021 | 0.5 | 500 | 0.8769 | 0.6405 |
+ | 0.8309 | 1.0 | 1000 | 0.8563 | 0.6533 |
+ | 0.7813 | 1.5 | 1500 | 0.8374 | 0.6543 |
+ | 0.7747 | 2.01 | 2000 | 0.7932 | 0.6565 |
+ | 0.7376 | 2.51 | 2500 | 0.7772 | 0.6668 |
 
 
  ### Framework versions
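
For reference, the updated hyperparameters above correspond roughly to the following `transformers` Trainer setup. This is a minimal sketch, not the author's script: the dataset, number of labels, output directory, epoch count, and `compute_metrics` function are not stated in the card and appear here only as placeholders.

```python
# Sketch of the updated training configuration reconstructed from this commit.
# Dataset, label count, output paths, and epoch count are NOT given by the card.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "microsoft/MiniLM-L12-H384-uncased"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(
    base_model,
    num_labels=3,  # assumption: the card does not state the number of classes
)

args = TrainingArguments(
    output_dir="minilm-finetuned",      # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=8,      # updated value in this commit
    per_device_eval_batch_size=8,       # updated value in this commit
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",
    eval_steps=500,                     # matches the 500-step eval cadence in the new table
)

# The card does not provide the dataset or the metric function (the F1 column above),
# so the Trainer is left without them here.
trainer = Trainer(
    model=model,
    args=args,
    # train_dataset=tokenized_train,
    # eval_dataset=tokenized_eval,
    # compute_metrics=compute_metrics,
)
# trainer.train()
```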
 
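Once published, the fine-tuned checkpoint can be loaded like any other `transformers` text-classification model. The repository id below is a placeholder, since the card does not name the final model repo.

```python
# Sketch only: replace the placeholder repo id with the actual fine-tuned checkpoint.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="gArthur98/your-finetuned-model",  # placeholder repo id
)
print(classifier("Replace this with the kind of text the model was trained on."))
```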