m-aliabbas1 committed
Commit
85f5c2a
1 Parent(s): e1bfa31

End of training

Files changed (1)
  1. README.md +109 -0
README.md ADDED

---
license: mit
base_model: prajjwal1/bert-tiny
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: tinybert_29_med_intents
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# tinybert_29_med_intents

This model is a fine-tuned version of [prajjwal1/bert-tiny](https://huggingface.co/prajjwal1/bert-tiny) on an unspecified dataset.
It achieves the following results on the evaluation set (a usage sketch follows the list):
- Loss: 0.4559
- Accuracy: 0.9122
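
For reference, the minimal sketch below shows one way to load the checkpoint for intent classification with the `transformers` pipeline API. It is not part of the original card: the repository id `m-aliabbas1/tinybert_29_med_intents` is inferred from the commit author and model name, and the example sentence is purely illustrative.

```python
# Minimal usage sketch (assumed Hub repository id; not part of the original card).
from transformers import pipeline

intent_classifier = pipeline(
    "text-classification",
    model="m-aliabbas1/tinybert_29_med_intents",  # assumed repository id
)

# Illustrative medical-intent query; the actual labels depend on the training data.
print(intent_classifier("I need to refill my blood pressure prescription."))
```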

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a matching `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
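
For orientation, these settings correspond roughly to the `TrainingArguments` sketch below. This is not the original training script: the output directory, the evaluation/logging strategies, and the surrounding `Trainer` setup are assumptions inferred from the card.

```python
# Sketch of TrainingArguments matching the hyperparameters above (Transformers 4.34.x).
# output_dir is illustrative; evaluation/logging strategies are inferred from the
# per-epoch results table below.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tinybert_29_med_intents",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed: the table reports one evaluation per epoch
    logging_strategy="epoch",     # assumed
)
```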

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| No log | 1.0 | 378 | 3.0359 | 0.3448 |
| 3.1662 | 2.0 | 756 | 2.7596 | 0.4953 |
| 2.7937 | 3.0 | 1134 | 2.4944 | 0.5141 |
| 2.4474 | 4.0 | 1512 | 2.2497 | 0.5674 |
| 2.4474 | 5.0 | 1890 | 2.0280 | 0.6207 |
| 2.1416 | 6.0 | 2268 | 1.8382 | 0.6646 |
| 1.8743 | 7.0 | 2646 | 1.6716 | 0.6740 |
| 1.6483 | 8.0 | 3024 | 1.5295 | 0.6959 |
| 1.6483 | 9.0 | 3402 | 1.4096 | 0.7304 |
| 1.4578 | 10.0 | 3780 | 1.3064 | 0.7304 |
| 1.3078 | 11.0 | 4158 | 1.2158 | 0.7524 |
| 1.1745 | 12.0 | 4536 | 1.1396 | 0.7555 |
| 1.1745 | 13.0 | 4914 | 1.0636 | 0.7837 |
| 1.0674 | 14.0 | 5292 | 1.0014 | 0.7931 |
| 0.9794 | 15.0 | 5670 | 0.9418 | 0.8119 |
| 0.8783 | 16.0 | 6048 | 0.8938 | 0.8307 |
| 0.8783 | 17.0 | 6426 | 0.8488 | 0.8401 |
| 0.8241 | 18.0 | 6804 | 0.8048 | 0.8370 |
| 0.7575 | 19.0 | 7182 | 0.7750 | 0.8401 |
| 0.7055 | 20.0 | 7560 | 0.7406 | 0.8433 |
| 0.7055 | 21.0 | 7938 | 0.7063 | 0.8589 |
| 0.6492 | 22.0 | 8316 | 0.6821 | 0.8527 |
| 0.6121 | 23.0 | 8694 | 0.6619 | 0.8589 |
| 0.5644 | 24.0 | 9072 | 0.6393 | 0.8683 |
| 0.5644 | 25.0 | 9450 | 0.6200 | 0.8683 |
| 0.5406 | 26.0 | 9828 | 0.5992 | 0.8746 |
| 0.5148 | 27.0 | 10206 | 0.5846 | 0.8809 |
| 0.4723 | 28.0 | 10584 | 0.5659 | 0.8934 |
| 0.4723 | 29.0 | 10962 | 0.5566 | 0.8934 |
| 0.4653 | 30.0 | 11340 | 0.5447 | 0.8966 |
| 0.4386 | 31.0 | 11718 | 0.5358 | 0.8997 |
| 0.4163 | 32.0 | 12096 | 0.5242 | 0.8997 |
| 0.4163 | 33.0 | 12474 | 0.5183 | 0.9028 |
| 0.404 | 34.0 | 12852 | 0.5113 | 0.9028 |
| 0.3849 | 35.0 | 13230 | 0.5005 | 0.9028 |
| 0.3677 | 36.0 | 13608 | 0.4966 | 0.9060 |
| 0.3677 | 37.0 | 13986 | 0.4908 | 0.9091 |
| 0.3652 | 38.0 | 14364 | 0.4843 | 0.9091 |
| 0.3533 | 39.0 | 14742 | 0.4784 | 0.9060 |
| 0.3362 | 40.0 | 15120 | 0.4733 | 0.9091 |
| 0.3362 | 41.0 | 15498 | 0.4703 | 0.9091 |
| 0.3403 | 42.0 | 15876 | 0.4668 | 0.9091 |
| 0.3268 | 43.0 | 16254 | 0.4642 | 0.9122 |
| 0.3229 | 44.0 | 16632 | 0.4642 | 0.9091 |
| 0.3177 | 45.0 | 17010 | 0.4606 | 0.9154 |
| 0.3177 | 46.0 | 17388 | 0.4575 | 0.9122 |
| 0.3137 | 47.0 | 17766 | 0.4574 | 0.9122 |
| 0.3067 | 48.0 | 18144 | 0.4562 | 0.9122 |
| 0.3054 | 49.0 | 18522 | 0.4561 | 0.9122 |
| 0.3054 | 50.0 | 18900 | 0.4559 | 0.9122 |

### Framework versions

- Transformers 4.34.1
- PyTorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1
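
A small environment sanity check follows; the version numbers in the comments are taken from the list above, and the check itself is not part of the original card.

```python
# Print installed versions to compare against the ones reported above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # reported: 4.34.1
print("PyTorch:", torch.__version__)              # reported: 2.1.0+cu118
print("Datasets:", datasets.__version__)          # reported: 2.14.6
print("Tokenizers:", tokenizers.__version__)      # reported: 0.14.1
```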