# hubert-classifier-aug-fold-3
This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on an unknown dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):
- Loss: 0.4622
- Accuracy: 0.8962
- Precision: 0.9090
- Recall: 0.8962
- F1: 0.8963
- Binary: 0.9276
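
The snippet below is a minimal inference sketch, not part of the original training code. It assumes the checkpoint is published on the Hub under a repo id like `fydhfzh/hubert-classifier-aug-fold-3` (substitute the actual path) and that inputs are audio files decodable by ffmpeg; HuBERT expects 16 kHz mono audio, and the pipeline resamples to the feature extractor's rate.

```python
# Hedged usage sketch -- the repo id and audio file are placeholders.
from transformers import pipeline

classifier = pipeline(
    "audio-classification",
    model="fydhfzh/hubert-classifier-aug-fold-3",  # substitute the actual Hub repo id
)

# The pipeline decodes the file and resamples it to the model's 16 kHz input.
for pred in classifier("example.wav", top_k=5):
    print(f"{pred['label']}: {pred['score']:.3f}")
```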
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
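
As a hedged reconstruction only, the listed values map onto `transformers.TrainingArguments` roughly as follows; `output_dir` and the evaluation cadence (the results table below reports metrics every 50 steps) are assumptions, not taken from the card.

```python
# Hedged reconstruction of the training configuration from the list above.
# Only the listed hyperparameters come from the card; output_dir and the
# evaluation cadence are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert-classifier-aug-fold-3",   # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,               # 32 * 4 = 128 effective train batch size (single device assumed)
    num_train_epochs=30,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                                   # "Native AMP" mixed precision
    evaluation_strategy="steps",                 # assumed: the results table reports metrics every 50 steps
    eval_steps=50,
)
```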
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Binary |
---|---|---|---|---|---|---|---|---|
No log | 0.22 | 50 | 3.8684 | 0.0648 | 0.0195 | 0.0648 | 0.0184 | 0.3348 |
No log | 0.43 | 100 | 3.4085 | 0.0891 | 0.0169 | 0.0891 | 0.0238 | 0.3588 |
No log | 0.65 | 150 | 3.1493 | 0.1107 | 0.0376 | 0.1107 | 0.0422 | 0.3746 |
No log | 0.86 | 200 | 2.8408 | 0.1822 | 0.1026 | 0.1822 | 0.1043 | 0.4273 |
3.6116 | 1.08 | 250 | 2.6147 | 0.2780 | 0.1840 | 0.2780 | 0.1853 | 0.4906 |
3.6116 | 1.29 | 300 | 2.4272 | 0.2982 | 0.2217 | 0.2982 | 0.2156 | 0.5023 |
3.6116 | 1.51 | 350 | 2.1582 | 0.4103 | 0.3356 | 0.4103 | 0.3247 | 0.5852 |
3.6116 | 1.72 | 400 | 2.0102 | 0.4143 | 0.3454 | 0.4143 | 0.3447 | 0.5892 |
3.6116 | 1.94 | 450 | 1.8796 | 0.4615 | 0.4433 | 0.4615 | 0.4066 | 0.6225 |
2.6485 | 2.16 | 500 | 1.6625 | 0.5412 | 0.5251 | 0.5412 | 0.4856 | 0.6771 |
2.6485 | 2.37 | 550 | 1.5422 | 0.5843 | 0.5780 | 0.5843 | 0.5307 | 0.7078 |
2.6485 | 2.59 | 600 | 1.4263 | 0.6073 | 0.5806 | 0.6073 | 0.5573 | 0.7246 |
2.6485 | 2.8 | 650 | 1.2985 | 0.6451 | 0.6244 | 0.6451 | 0.6039 | 0.7503 |
2.1146 | 3.02 | 700 | 1.2788 | 0.6613 | 0.6564 | 0.6613 | 0.6273 | 0.7614 |
2.1146 | 3.23 | 750 | 1.2186 | 0.6802 | 0.6820 | 0.6802 | 0.6499 | 0.7742 |
2.1146 | 3.45 | 800 | 1.1269 | 0.7152 | 0.7428 | 0.7152 | 0.6978 | 0.7991 |
2.1146 | 3.66 | 850 | 1.1179 | 0.6680 | 0.6970 | 0.6680 | 0.6377 | 0.7675 |
2.1146 | 3.88 | 900 | 1.0928 | 0.7031 | 0.7264 | 0.7031 | 0.6793 | 0.7889 |
1.8074 | 4.09 | 950 | 0.9427 | 0.7638 | 0.7780 | 0.7638 | 0.7513 | 0.8350 |
1.8074 | 4.31 | 1000 | 0.8876 | 0.7692 | 0.7952 | 0.7692 | 0.7603 | 0.8382 |
1.8074 | 4.53 | 1050 | 0.8686 | 0.7773 | 0.7852 | 0.7773 | 0.7664 | 0.8425 |
1.8074 | 4.74 | 1100 | 0.8814 | 0.7665 | 0.7770 | 0.7665 | 0.7497 | 0.8363 |
1.8074 | 4.96 | 1150 | 0.8280 | 0.7706 | 0.7896 | 0.7706 | 0.7603 | 0.8401 |
1.5857 | 5.17 | 1200 | 0.8050 | 0.7773 | 0.8023 | 0.7773 | 0.7720 | 0.8439 |
1.5857 | 5.39 | 1250 | 0.7475 | 0.8016 | 0.8114 | 0.8016 | 0.7976 | 0.8609 |
1.5857 | 5.6 | 1300 | 0.7396 | 0.7895 | 0.8187 | 0.7895 | 0.7859 | 0.8521 |
1.5857 | 5.82 | 1350 | 0.7637 | 0.8030 | 0.8177 | 0.8030 | 0.7953 | 0.8598 |
1.4411 | 6.03 | 1400 | 0.7511 | 0.7976 | 0.8157 | 0.7976 | 0.7934 | 0.8574 |
1.4411 | 6.25 | 1450 | 0.6479 | 0.8232 | 0.8392 | 0.8232 | 0.8185 | 0.8756 |
1.4411 | 6.47 | 1500 | 0.6521 | 0.8286 | 0.8494 | 0.8286 | 0.8233 | 0.8803 |
1.4411 | 6.68 | 1550 | 0.5778 | 0.8529 | 0.8637 | 0.8529 | 0.8501 | 0.8962 |
1.4411 | 6.9 | 1600 | 0.5898 | 0.8259 | 0.8428 | 0.8259 | 0.8249 | 0.8776 |
1.3162 | 7.11 | 1650 | 0.5784 | 0.8421 | 0.8614 | 0.8421 | 0.8404 | 0.8892 |
1.3162 | 7.33 | 1700 | 0.6395 | 0.8232 | 0.8407 | 0.8232 | 0.8170 | 0.8764 |
1.3162 | 7.54 | 1750 | 0.6334 | 0.8340 | 0.8519 | 0.8340 | 0.8320 | 0.8834 |
1.3162 | 7.76 | 1800 | 0.6133 | 0.8286 | 0.8513 | 0.8286 | 0.8274 | 0.8798 |
1.3162 | 7.97 | 1850 | 0.5488 | 0.8502 | 0.8663 | 0.8502 | 0.8496 | 0.8949 |
1.2312 | 8.19 | 1900 | 0.6521 | 0.8246 | 0.8411 | 0.8246 | 0.8227 | 0.8769 |
1.2312 | 8.41 | 1950 | 0.5706 | 0.8529 | 0.8669 | 0.8529 | 0.8528 | 0.8962 |
1.2312 | 8.62 | 2000 | 0.5822 | 0.8462 | 0.8596 | 0.8462 | 0.8448 | 0.8924 |
1.2312 | 8.84 | 2050 | 0.5332 | 0.8502 | 0.8646 | 0.8502 | 0.8498 | 0.8953 |
1.1409 | 9.05 | 2100 | 0.5226 | 0.8650 | 0.8743 | 0.8650 | 0.8631 | 0.9053 |
1.1409 | 9.27 | 2150 | 0.5451 | 0.8623 | 0.8750 | 0.8623 | 0.8617 | 0.9032 |
1.1409 | 9.48 | 2200 | 0.5940 | 0.8381 | 0.8510 | 0.8381 | 0.8365 | 0.8860 |
1.1409 | 9.7 | 2250 | 0.5303 | 0.8570 | 0.8686 | 0.8570 | 0.8568 | 0.8988 |
1.1409 | 9.91 | 2300 | 0.5706 | 0.8448 | 0.8622 | 0.8448 | 0.8429 | 0.8912 |
1.0865 | 10.13 | 2350 | 0.5140 | 0.8623 | 0.8780 | 0.8623 | 0.8635 | 0.9026 |
1.0865 | 10.34 | 2400 | 0.5106 | 0.8704 | 0.8811 | 0.8704 | 0.8692 | 0.9092 |
1.0865 | 10.56 | 2450 | 0.5478 | 0.8583 | 0.8753 | 0.8583 | 0.8570 | 0.9005 |
1.0865 | 10.78 | 2500 | 0.6036 | 0.8583 | 0.8694 | 0.8583 | 0.8548 | 0.9003 |
1.0865 | 10.99 | 2550 | 0.5360 | 0.8543 | 0.8712 | 0.8543 | 0.8498 | 0.8984 |
1.0383 | 11.21 | 2600 | 0.5426 | 0.8570 | 0.8691 | 0.8570 | 0.8558 | 0.8982 |
1.0383 | 11.42 | 2650 | 0.5124 | 0.8691 | 0.8777 | 0.8691 | 0.8673 | 0.9067 |
1.0383 | 11.64 | 2700 | 0.5676 | 0.8435 | 0.8554 | 0.8435 | 0.8422 | 0.8892 |
1.0383 | 11.85 | 2750 | 0.5387 | 0.8596 | 0.8700 | 0.8596 | 0.8590 | 0.9022 |
0.9938 | 12.07 | 2800 | 0.5402 | 0.8691 | 0.8778 | 0.8691 | 0.8675 | 0.9089 |
0.9938 | 12.28 | 2850 | 0.5814 | 0.8529 | 0.8603 | 0.8529 | 0.8496 | 0.8969 |
0.9938 | 12.5 | 2900 | 0.5124 | 0.8623 | 0.8705 | 0.8623 | 0.8594 | 0.9034 |
0.9938 | 12.72 | 2950 | 0.5077 | 0.8623 | 0.8739 | 0.8623 | 0.8604 | 0.9032 |
0.9938 | 12.93 | 3000 | 0.5305 | 0.8704 | 0.8785 | 0.8704 | 0.8675 | 0.9101 |
0.9526 | 13.15 | 3050 | 0.5455 | 0.8718 | 0.8849 | 0.8718 | 0.8707 | 0.9100 |
0.9526 | 13.36 | 3100 | 0.5153 | 0.8826 | 0.8939 | 0.8826 | 0.8822 | 0.9175 |
0.9526 | 13.58 | 3150 | 0.5218 | 0.8826 | 0.8902 | 0.8826 | 0.8813 | 0.9167 |
0.9526 | 13.79 | 3200 | 0.5361 | 0.8637 | 0.8756 | 0.8637 | 0.8634 | 0.9030 |
0.91 | 14.01 | 3250 | 0.5174 | 0.8785 | 0.8873 | 0.8785 | 0.8780 | 0.9139 |
0.91 | 14.22 | 3300 | 0.5346 | 0.8799 | 0.8892 | 0.8799 | 0.8787 | 0.9158 |
0.91 | 14.44 | 3350 | 0.5586 | 0.8650 | 0.8747 | 0.8650 | 0.8634 | 0.9050 |
0.91 | 14.66 | 3400 | 0.5504 | 0.8704 | 0.8816 | 0.8704 | 0.8698 | 0.9097 |
0.91 | 14.87 | 3450 | 0.5643 | 0.8718 | 0.8814 | 0.8718 | 0.8700 | 0.9101 |
0.8689 | 15.09 | 3500 | 0.5425 | 0.8650 | 0.8766 | 0.8650 | 0.8642 | 0.9043 |
0.8689 | 15.3 | 3550 | 0.5609 | 0.8623 | 0.8775 | 0.8623 | 0.8616 | 0.9038 |
0.8689 | 15.52 | 3600 | 0.5440 | 0.8745 | 0.8847 | 0.8745 | 0.8739 | 0.9116 |
0.8689 | 15.73 | 3650 | 0.5020 | 0.8718 | 0.8814 | 0.8718 | 0.8714 | 0.9103 |
0.8689 | 15.95 | 3700 | 0.5650 | 0.8718 | 0.8810 | 0.8718 | 0.8704 | 0.9099 |
0.8437 | 16.16 | 3750 | 0.5115 | 0.8785 | 0.8874 | 0.8785 | 0.8774 | 0.9146 |
0.8437 | 16.38 | 3800 | 0.5651 | 0.8596 | 0.8735 | 0.8596 | 0.8592 | 0.9022 |
0.8437 | 16.59 | 3850 | 0.4996 | 0.8920 | 0.9025 | 0.8920 | 0.8921 | 0.9242 |
0.8437 | 16.81 | 3900 | 0.5528 | 0.8772 | 0.8887 | 0.8772 | 0.8765 | 0.9134 |
0.8213 | 17.03 | 3950 | 0.5568 | 0.8677 | 0.8816 | 0.8677 | 0.8666 | 0.9074 |
0.8213 | 17.24 | 4000 | 0.5270 | 0.8812 | 0.8906 | 0.8812 | 0.8804 | 0.9167 |
0.8213 | 17.46 | 4050 | 0.5239 | 0.8812 | 0.8922 | 0.8812 | 0.8800 | 0.9162 |
0.8213 | 17.67 | 4100 | 0.4915 | 0.8839 | 0.8921 | 0.8839 | 0.8834 | 0.9181 |
0.8213 | 17.89 | 4150 | 0.5282 | 0.8812 | 0.8914 | 0.8812 | 0.8807 | 0.9152 |
0.7835 | 18.1 | 4200 | 0.5031 | 0.8866 | 0.8959 | 0.8866 | 0.8865 | 0.9194 |
0.7835 | 18.32 | 4250 | 0.4997 | 0.8812 | 0.8898 | 0.8812 | 0.8803 | 0.9158 |
0.7835 | 18.53 | 4300 | 0.5080 | 0.8826 | 0.8904 | 0.8826 | 0.8809 | 0.9167 |
0.7835 | 18.75 | 4350 | 0.5264 | 0.8812 | 0.8898 | 0.8812 | 0.8800 | 0.9158 |
0.7835 | 18.97 | 4400 | 0.5487 | 0.8718 | 0.8808 | 0.8718 | 0.8707 | 0.9105 |
0.7606 | 19.18 | 4450 | 0.5266 | 0.8772 | 0.8877 | 0.8772 | 0.8759 | 0.9139 |
0.7606 | 19.4 | 4500 | 0.5257 | 0.8772 | 0.8875 | 0.8772 | 0.8770 | 0.9139 |
0.7606 | 19.61 | 4550 | 0.5321 | 0.8880 | 0.8977 | 0.8880 | 0.8882 | 0.9215 |
0.7606 | 19.83 | 4600 | 0.5349 | 0.8772 | 0.8880 | 0.8772 | 0.8765 | 0.9139 |
0.7342 | 20.04 | 4650 | 0.5250 | 0.8880 | 0.8962 | 0.8880 | 0.8877 | 0.9219 |
0.7342 | 20.26 | 4700 | 0.5081 | 0.8907 | 0.8990 | 0.8907 | 0.8904 | 0.9232 |
0.7342 | 20.47 | 4750 | 0.4958 | 0.8839 | 0.8941 | 0.8839 | 0.8842 | 0.9171 |
0.7342 | 20.69 | 4800 | 0.5293 | 0.8826 | 0.8928 | 0.8826 | 0.8819 | 0.9181 |
0.7342 | 20.91 | 4850 | 0.5094 | 0.8812 | 0.8924 | 0.8812 | 0.8805 | 0.9167 |
0.7129 | 21.12 | 4900 | 0.4922 | 0.8920 | 0.8997 | 0.8920 | 0.8908 | 0.9242 |
0.7129 | 21.34 | 4950 | 0.5078 | 0.8907 | 0.9000 | 0.8907 | 0.8901 | 0.9238 |
0.7129 | 21.55 | 5000 | 0.5303 | 0.8799 | 0.8892 | 0.8799 | 0.8781 | 0.9167 |
0.7129 | 21.77 | 5050 | 0.5531 | 0.8731 | 0.8842 | 0.8731 | 0.8711 | 0.9115 |
0.7129 | 21.98 | 5100 | 0.5572 | 0.8799 | 0.8920 | 0.8799 | 0.8784 | 0.9158 |
0.7032 | 22.2 | 5150 | 0.5151 | 0.8799 | 0.8903 | 0.8799 | 0.8793 | 0.9167 |
0.7032 | 22.41 | 5200 | 0.5090 | 0.8812 | 0.8921 | 0.8812 | 0.8808 | 0.9177 |
0.7032 | 22.63 | 5250 | 0.5318 | 0.8799 | 0.8891 | 0.8799 | 0.8785 | 0.9158 |
0.7032 | 22.84 | 5300 | 0.5114 | 0.8826 | 0.8897 | 0.8826 | 0.8812 | 0.9171 |
0.6809 | 23.06 | 5350 | 0.5049 | 0.8866 | 0.8946 | 0.8866 | 0.8858 | 0.9209 |
0.6809 | 23.28 | 5400 | 0.5378 | 0.8799 | 0.8901 | 0.8799 | 0.8786 | 0.9152 |
0.6809 | 23.49 | 5450 | 0.5088 | 0.8812 | 0.8905 | 0.8812 | 0.8806 | 0.9158 |
0.6809 | 23.71 | 5500 | 0.4883 | 0.8920 | 0.9033 | 0.8920 | 0.8925 | 0.9252 |
0.6809 | 23.92 | 5550 | 0.5168 | 0.8799 | 0.8911 | 0.8799 | 0.8800 | 0.9152 |
0.6604 | 24.14 | 5600 | 0.5167 | 0.8799 | 0.8907 | 0.8799 | 0.8795 | 0.9148 |
0.6604 | 24.35 | 5650 | 0.5092 | 0.8866 | 0.9011 | 0.8866 | 0.8878 | 0.9200 |
0.6604 | 24.57 | 5700 | 0.5048 | 0.8961 | 0.9069 | 0.8961 | 0.8965 | 0.9270 |
0.6604 | 24.78 | 5750 | 0.5303 | 0.8839 | 0.8973 | 0.8839 | 0.8835 | 0.9186 |
0.6604 | 25.0 | 5800 | 0.4996 | 0.8934 | 0.9041 | 0.8934 | 0.8939 | 0.9242 |
0.6595 | 25.22 | 5850 | 0.5095 | 0.8934 | 0.9033 | 0.8934 | 0.8927 | 0.9242 |
0.6595 | 25.43 | 5900 | 0.5109 | 0.8920 | 0.9024 | 0.8920 | 0.8921 | 0.9232 |
0.6595 | 25.65 | 5950 | 0.4993 | 0.8893 | 0.8973 | 0.8893 | 0.8890 | 0.9219 |
0.6595 | 25.86 | 6000 | 0.4954 | 0.8934 | 0.9022 | 0.8934 | 0.8928 | 0.9247 |
0.6347 | 26.08 | 6050 | 0.4939 | 0.8988 | 0.9076 | 0.8988 | 0.8986 | 0.9279 |
0.6347 | 26.29 | 6100 | 0.4820 | 0.8974 | 0.9049 | 0.8974 | 0.8970 | 0.9270 |
0.6347 | 26.51 | 6150 | 0.5168 | 0.8880 | 0.8952 | 0.8880 | 0.8869 | 0.9205 |
0.6347 | 26.72 | 6200 | 0.5275 | 0.8839 | 0.8916 | 0.8839 | 0.8827 | 0.9181 |
0.6347 | 26.94 | 6250 | 0.5026 | 0.8907 | 0.8991 | 0.8907 | 0.8898 | 0.9219 |
0.6361 | 27.16 | 6300 | 0.5003 | 0.8988 | 0.9076 | 0.8988 | 0.8984 | 0.9275 |
0.6361 | 27.37 | 6350 | 0.4777 | 0.8988 | 0.9069 | 0.8988 | 0.8984 | 0.9275 |
0.6361 | 27.59 | 6400 | 0.4904 | 0.8988 | 0.9079 | 0.8988 | 0.8986 | 0.9275 |
0.6361 | 27.8 | 6450 | 0.4885 | 0.9001 | 0.9084 | 0.9001 | 0.8998 | 0.9285 |
0.631 | 28.02 | 6500 | 0.5134 | 0.8893 | 0.8973 | 0.8893 | 0.8882 | 0.9209 |
0.631 | 28.23 | 6550 | 0.5128 | 0.8920 | 0.9011 | 0.8920 | 0.8916 | 0.9232 |
0.631 | 28.45 | 6600 | 0.5136 | 0.8947 | 0.9032 | 0.8947 | 0.8942 | 0.9251 |
0.631 | 28.66 | 6650 | 0.5148 | 0.8907 | 0.8998 | 0.8907 | 0.8900 | 0.9219 |
0.631 | 28.88 | 6700 | 0.5143 | 0.8893 | 0.8971 | 0.8893 | 0.8883 | 0.9215 |
0.6104 | 29.09 | 6750 | 0.5237 | 0.8853 | 0.8952 | 0.8853 | 0.8844 | 0.9181 |
0.6104 | 29.31 | 6800 | 0.5187 | 0.8880 | 0.8976 | 0.8880 | 0.8873 | 0.9200 |
0.6104 | 29.53 | 6850 | 0.5183 | 0.8866 | 0.8964 | 0.8866 | 0.8860 | 0.9190 |
0.6104 | 29.74 | 6900 | 0.5172 | 0.8907 | 0.9006 | 0.8907 | 0.8899 | 0.9219 |
0.6104 | 29.96 | 6950 | 0.5141 | 0.8907 | 0.9001 | 0.8907 | 0.8902 | 0.9219 |
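
The card does not include the metric code. The sketch below is one plausible `compute_metrics` implementation using scikit-learn; weighted averaging is an assumption, made because the reported recall always matches accuracy, which holds for weighted-average recall. The undocumented `Binary` column is omitted.

```python
# Plausible metric computation (assumption: weighted averaging; the card does
# not document the "Binary" column, so it is omitted here).
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```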
### Framework versions
- Transformers 4.38.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1