DrishtiSharma committed
Commit e45a091 (parent: 40447dd)

update model card README.md

Files changed (1): README.md (+160, -0)

README.md ADDED

---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: wav2vec2-base-finetuned-sentiment-mesd-v11
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2-base-finetuned-sentiment-mesd-v11

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3071
- Accuracy: 0.9308
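As a quick illustration of how the checkpoint can be used, here is a minimal inference sketch with the Transformers `audio-classification` pipeline. The repository id `DrishtiSharma/wav2vec2-base-finetuned-sentiment-mesd-v11` and the input file `sample.wav` are assumptions, not taken from this card; adjust them to the published model id and your own 16 kHz speech clip.

```python
# Minimal inference sketch (assumed repository id and input file, see note above).
from transformers import pipeline

classifier = pipeline(
    "audio-classification",
    model="DrishtiSharma/wav2vec2-base-finetuned-sentiment-mesd-v11",  # assumed repo id
)

# Decoding a file path requires ffmpeg; a raw 16 kHz waveform (NumPy array) also works.
predictions = classifier("sample.wav")  # hypothetical input file
print(predictions)  # list of {"label": ..., "score": ...} dicts, best score first
```

The pipeline wraps the feature extractor and the fine-tuned classification head, so no manual preprocessing is needed.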
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list for how they map to `TrainingArguments`):
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 40
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
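These values line up with a standard `transformers.TrainingArguments` configuration roughly like the sketch below. This is a hedged reconstruction rather than the original training script: the field names are ordinary `TrainingArguments` parameters, single-GPU training is assumed (64 per-device samples x 4 accumulation steps gives the effective batch size of 256), and `output_dir` and `evaluation_strategy` are illustrative guesses.

```python
# Hedged reconstruction of the configuration above; not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-finetuned-sentiment-mesd-v11",  # illustrative
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=40,
    gradient_accumulation_steps=4,  # 64 x 4 = total train batch size of 256 (single GPU assumed)
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumption, consistent with the per-epoch results below
)
```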
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.86 | 3 | 1.7516 | 0.3846 |
| 1.9428 | 1.86 | 6 | 1.6859 | 0.4308 |
| 1.9428 | 2.86 | 9 | 1.5575 | 0.4692 |
| 1.9629 | 3.86 | 12 | 1.4160 | 0.4846 |
| 1.5678 | 4.86 | 15 | 1.2979 | 0.5308 |
| 1.5678 | 5.86 | 18 | 1.2294 | 0.5308 |
| 1.4728 | 6.86 | 21 | 1.0703 | 0.5923 |
| 1.4728 | 7.86 | 24 | 0.9926 | 0.6308 |
| 1.2588 | 8.86 | 27 | 0.9202 | 0.6846 |
| 0.991 | 9.86 | 30 | 0.8537 | 0.6846 |
| 0.991 | 10.86 | 33 | 0.8816 | 0.6769 |
| 0.9059 | 11.86 | 36 | 0.7149 | 0.7769 |
| 0.9059 | 12.86 | 39 | 0.7676 | 0.7462 |
| 0.7901 | 13.86 | 42 | 0.6971 | 0.7538 |
| 0.6278 | 14.86 | 45 | 0.6671 | 0.7923 |
| 0.6278 | 15.86 | 48 | 0.5681 | 0.8231 |
| 0.5678 | 16.86 | 51 | 0.5535 | 0.8154 |
| 0.5678 | 17.86 | 54 | 0.5947 | 0.8077 |
| 0.5157 | 18.86 | 57 | 0.6396 | 0.7692 |
| 0.4189 | 19.86 | 60 | 0.5291 | 0.8077 |
| 0.4189 | 20.86 | 63 | 0.4600 | 0.8538 |
| 0.3885 | 21.86 | 66 | 0.5188 | 0.8308 |
| 0.3885 | 22.86 | 69 | 0.5959 | 0.7923 |
| 0.3255 | 23.86 | 72 | 0.5240 | 0.8462 |
| 0.2711 | 24.86 | 75 | 0.5105 | 0.8385 |
| 0.2711 | 25.86 | 78 | 0.5177 | 0.8231 |
| 0.2748 | 26.86 | 81 | 0.3302 | 0.8923 |
| 0.2748 | 27.86 | 84 | 0.4774 | 0.8538 |
| 0.2379 | 28.86 | 87 | 0.4204 | 0.8769 |
| 0.1982 | 29.86 | 90 | 0.6540 | 0.7692 |
| 0.1982 | 30.86 | 93 | 0.5664 | 0.8308 |
| 0.2171 | 31.86 | 96 | 0.5100 | 0.8462 |
| 0.2171 | 32.86 | 99 | 0.3924 | 0.8769 |
| 0.17 | 33.86 | 102 | 0.6002 | 0.8231 |
| 0.1761 | 34.86 | 105 | 0.4364 | 0.8538 |
| 0.1761 | 35.86 | 108 | 0.4166 | 0.8692 |
| 0.1703 | 36.86 | 111 | 0.4374 | 0.8692 |
| 0.1703 | 37.86 | 114 | 0.3872 | 0.8615 |
| 0.1569 | 38.86 | 117 | 0.3941 | 0.8538 |
| 0.1149 | 39.86 | 120 | 0.4004 | 0.8538 |
| 0.1149 | 40.86 | 123 | 0.4360 | 0.8385 |
| 0.1087 | 41.86 | 126 | 0.4387 | 0.8615 |
| 0.1087 | 42.86 | 129 | 0.4352 | 0.8692 |
| 0.1039 | 43.86 | 132 | 0.4018 | 0.8846 |
| 0.099 | 44.86 | 135 | 0.4019 | 0.8846 |
| 0.099 | 45.86 | 138 | 0.4083 | 0.8923 |
| 0.1043 | 46.86 | 141 | 0.4594 | 0.8692 |
| 0.1043 | 47.86 | 144 | 0.4478 | 0.8769 |
| 0.0909 | 48.86 | 147 | 0.5025 | 0.8538 |
| 0.1024 | 49.86 | 150 | 0.5442 | 0.8692 |
| 0.1024 | 50.86 | 153 | 0.3827 | 0.8769 |
| 0.1457 | 51.86 | 156 | 0.6816 | 0.8231 |
| 0.1457 | 52.86 | 159 | 0.3435 | 0.8923 |
| 0.1233 | 53.86 | 162 | 0.4418 | 0.8769 |
| 0.101 | 54.86 | 165 | 0.4629 | 0.8846 |
| 0.101 | 55.86 | 168 | 0.4616 | 0.8692 |
| 0.0969 | 56.86 | 171 | 0.3608 | 0.8923 |
| 0.0969 | 57.86 | 174 | 0.4867 | 0.8615 |
| 0.0981 | 58.86 | 177 | 0.4493 | 0.8692 |
| 0.0642 | 59.86 | 180 | 0.3841 | 0.8538 |
| 0.0642 | 60.86 | 183 | 0.4509 | 0.8769 |
| 0.0824 | 61.86 | 186 | 0.4477 | 0.8769 |
| 0.0824 | 62.86 | 189 | 0.4649 | 0.8615 |
| 0.0675 | 63.86 | 192 | 0.3492 | 0.9231 |
| 0.0839 | 64.86 | 195 | 0.3763 | 0.8846 |
| 0.0839 | 65.86 | 198 | 0.4475 | 0.8769 |
| 0.0677 | 66.86 | 201 | 0.4104 | 0.8923 |
| 0.0677 | 67.86 | 204 | 0.3071 | 0.9308 |
| 0.0626 | 68.86 | 207 | 0.3598 | 0.9077 |
| 0.0412 | 69.86 | 210 | 0.3771 | 0.8923 |
| 0.0412 | 70.86 | 213 | 0.4043 | 0.8846 |
| 0.0562 | 71.86 | 216 | 0.3696 | 0.9077 |
| 0.0562 | 72.86 | 219 | 0.3295 | 0.9077 |
| 0.0447 | 73.86 | 222 | 0.3616 | 0.8923 |
| 0.0727 | 74.86 | 225 | 0.3495 | 0.8923 |
| 0.0727 | 75.86 | 228 | 0.4330 | 0.8846 |
| 0.0576 | 76.86 | 231 | 0.5179 | 0.8923 |
| 0.0576 | 77.86 | 234 | 0.5544 | 0.8846 |
| 0.0489 | 78.86 | 237 | 0.4630 | 0.9 |
| 0.0472 | 79.86 | 240 | 0.4513 | 0.9 |
| 0.0472 | 80.86 | 243 | 0.4207 | 0.9077 |
| 0.0386 | 81.86 | 246 | 0.4118 | 0.8769 |
| 0.0386 | 82.86 | 249 | 0.4764 | 0.8769 |
| 0.0372 | 83.86 | 252 | 0.4167 | 0.8769 |
| 0.0344 | 84.86 | 255 | 0.3744 | 0.9077 |
| 0.0344 | 85.86 | 258 | 0.3712 | 0.9077 |
| 0.0459 | 86.86 | 261 | 0.4249 | 0.8846 |
| 0.0459 | 87.86 | 264 | 0.4687 | 0.8846 |
| 0.0364 | 88.86 | 267 | 0.4194 | 0.8923 |
| 0.0283 | 89.86 | 270 | 0.3963 | 0.8923 |
| 0.0283 | 90.86 | 273 | 0.3982 | 0.8923 |
| 0.0278 | 91.86 | 276 | 0.3838 | 0.9077 |
| 0.0278 | 92.86 | 279 | 0.3731 | 0.9 |
| 0.0352 | 93.86 | 282 | 0.3736 | 0.9 |
| 0.0297 | 94.86 | 285 | 0.3702 | 0.9 |
| 0.0297 | 95.86 | 288 | 0.3521 | 0.9154 |
| 0.0245 | 96.86 | 291 | 0.3522 | 0.9154 |
| 0.0245 | 97.86 | 294 | 0.3600 | 0.9077 |
| 0.0241 | 98.86 | 297 | 0.3636 | 0.9077 |
| 0.0284 | 99.86 | 300 | 0.3639 | 0.9077 |

### Framework versions

- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 2.0.0
- Tokenizers 0.11.6
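A quick way to confirm that a local environment matches these pins is a version check like the one below; the import names (`transformers`, `torch`, `datasets`, `tokenizers`) are the standard PyPI packages, and the exact PyTorch build string will differ on other CUDA setups.

```python
# Quick environment check against the framework versions listed above.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "Transformers": ("4.17.0", transformers.__version__),
    "PyTorch": ("1.10.0+cu111", torch.__version__),
    "Datasets": ("2.0.0", datasets.__version__),
    "Tokenizers": ("0.11.6", tokenizers.__version__),
}

for name, (wanted, found) in expected.items():
    status = "OK" if found == wanted else "differs"
    print(f"{name}: expected {wanted}, found {found} ({status})")
```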