tthhanh committed on
Commit 16a9458 · verified · 1 Parent(s): 0cbcf9b

End of training

Files changed (2):
  1. README.md +67 -21
  2. adapter_model.safetensors +1 -1
README.md CHANGED
@@ -16,27 +16,27 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [mistralai/Mistral-7B-v0.3](https://huggingface.co/mistralai/Mistral-7B-v0.3) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.6737
- - Law Precision: 0.8
- - Law Recall: 0.8649
- - Law F1: 0.8312
+ - Loss: 0.6841
+ - Law Precision: 0.7927
+ - Law Recall: 0.8784
+ - Law F1: 0.8333
  - Law Number: 74
- - Violated by Precision: 0.6818
+ - Violated by Precision: 0.6593
  - Violated by Recall: 0.8219
- - Violated by F1: 0.7453
+ - Violated by F1: 0.7317
  - Violated by Number: 73
- - Violated on Precision: 0.3766
+ - Violated on Precision: 0.3919
  - Violated on Recall: 0.5273
- - Violated on F1: 0.4394
+ - Violated on F1: 0.4496
  - Violated on Number: 55
- - Violation Precision: 0.4560
- - Violation Recall: 0.6290
- - Violation F1: 0.5287
+ - Violation Precision: 0.4662
+ - Violation Recall: 0.6190
+ - Violation F1: 0.5318
  - Violation Number: 601
- - Overall Precision: 0.4944
- - Overall Recall: 0.6613
- - Overall F1: 0.5658
- - Overall Accuracy: 0.9406
+ - Overall Precision: 0.5033
+ - Overall Recall: 0.6550
+ - Overall F1: 0.5693
+ - Overall Accuracy: 0.9409
 
  ## Model description
 
@@ -65,12 +65,58 @@ The following hyperparameters were used during training:
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Law Precision | Law Recall | Law F1 | Law Number | Violated by Precision | Violated by Recall | Violated by F1 | Violated by Number | Violated on Precision | Violated on Recall | Violated on F1 | Violated on Number | Violation Precision | Violation Recall | Violation F1 | Violation Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:-------:|:----:|:---------------:|:-------------:|:----------:|:------:|:----------:|:---------------------:|:------------------:|:--------------:|:------------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:-------------------:|:----------------:|:------------:|:----------------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 0.254 | 11.1111 | 500 | 0.5374 | 0.8133 | 0.8243 | 0.8188 | 74 | 0.7037 | 0.7808 | 0.7403 | 73 | 0.4915 | 0.5273 | 0.5088 | 55 | 0.3979 | 0.5092 | 0.4467 | 601 | 0.4604 | 0.5641 | 0.5070 | 0.9320 |
- | 0.0035 | 22.2222 | 1000 | 0.6315 | 0.7701 | 0.9054 | 0.8323 | 74 | 0.6860 | 0.8082 | 0.7421 | 73 | 0.2933 | 0.4 | 0.3385 | 55 | 0.4363 | 0.6156 | 0.5107 | 601 | 0.4726 | 0.6451 | 0.5456 | 0.9382 |
- | 0.0064 | 33.3333 | 1500 | 0.7000 | 0.6633 | 0.8784 | 0.7558 | 74 | 0.6458 | 0.8493 | 0.7337 | 73 | 0.3021 | 0.5273 | 0.3841 | 55 | 0.4504 | 0.6273 | 0.5243 | 601 | 0.4729 | 0.6638 | 0.5523 | 0.9375 |
- | 0.0013 | 44.4444 | 2000 | 0.6737 | 0.8 | 0.8649 | 0.8312 | 74 | 0.6818 | 0.8219 | 0.7453 | 73 | 0.3766 | 0.5273 | 0.4394 | 55 | 0.4560 | 0.6290 | 0.5287 | 601 | 0.4944 | 0.6613 | 0.5658 | 0.9406 |
+ | Training Loss | Epoch | Step | Validation Loss | Law Precision | Law Recall | Law F1 | Law Number | Violated by Precision | Violated by Recall | Violated by F1 | Violated by Number | Violated on Precision | Violated on Recall | Violated on F1 | Violated on Number | Violation Precision | Violation Recall | Violation F1 | Violation Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:-------------:|:----------:|:------:|:----------:|:---------------------:|:------------------:|:--------------:|:------------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:-------------------:|:----------------:|:------------:|:----------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | No log | 1.0 | 45 | 0.3496 | 0.2973 | 0.1486 | 0.1982 | 74 | 0.0 | 0.0 | 0.0 | 73 | 0.0 | 0.0 | 0.0 | 55 | 0.1484 | 0.2696 | 0.1914 | 601 | 0.1532 | 0.2154 | 0.1791 | 0.8813 |
+ | No log | 2.0 | 90 | 0.2812 | 0.3594 | 0.6216 | 0.4554 | 74 | 0.4271 | 0.5616 | 0.4852 | 73 | 0.1538 | 0.1091 | 0.1277 | 55 | 0.2941 | 0.4243 | 0.3474 | 601 | 0.3080 | 0.4334 | 0.3601 | 0.9133 |
+ | No log | 3.0 | 135 | 0.2398 | 0.5536 | 0.8378 | 0.6667 | 74 | 0.4234 | 0.7945 | 0.5524 | 73 | 0.1405 | 0.3091 | 0.1932 | 55 | 0.3086 | 0.4992 | 0.3814 | 601 | 0.3256 | 0.5442 | 0.4075 | 0.9234 |
+ | No log | 4.0 | 180 | 0.2495 | 0.6667 | 0.5676 | 0.6131 | 74 | 0.5672 | 0.5205 | 0.5429 | 73 | 0.2286 | 0.1455 | 0.1778 | 55 | 0.4 | 0.5391 | 0.4592 | 601 | 0.4226 | 0.5131 | 0.4634 | 0.9320 |
+ | No log | 5.0 | 225 | 0.2947 | 0.66 | 0.8919 | 0.7586 | 74 | 0.4627 | 0.8493 | 0.5990 | 73 | 0.3011 | 0.5091 | 0.3784 | 55 | 0.3602 | 0.5724 | 0.4422 | 601 | 0.3900 | 0.6227 | 0.4796 | 0.9286 |
+ | No log | 6.0 | 270 | 0.3395 | 0.8026 | 0.8243 | 0.8133 | 74 | 0.5833 | 0.8630 | 0.6961 | 73 | 0.2584 | 0.4182 | 0.3194 | 55 | 0.4035 | 0.5774 | 0.4750 | 601 | 0.4360 | 0.6152 | 0.5103 | 0.9352 |
+ | No log | 7.0 | 315 | 0.3844 | 0.4793 | 0.7838 | 0.5949 | 74 | 0.4122 | 0.8356 | 0.5520 | 73 | 0.3165 | 0.4545 | 0.3731 | 55 | 0.3769 | 0.5757 | 0.4556 | 601 | 0.3870 | 0.6102 | 0.4737 | 0.9336 |
+ | No log | 8.0 | 360 | 0.3227 | 0.6854 | 0.8243 | 0.7485 | 74 | 0.5424 | 0.8767 | 0.6702 | 73 | 0.2385 | 0.4727 | 0.3171 | 55 | 0.4169 | 0.6007 | 0.4922 | 601 | 0.4332 | 0.6376 | 0.5159 | 0.9380 |
+ | No log | 9.0 | 405 | 0.5972 | 0.8205 | 0.8649 | 0.8421 | 74 | 0.6145 | 0.6986 | 0.6538 | 73 | 0.3548 | 0.4 | 0.3761 | 55 | 0.3922 | 0.6506 | 0.4894 | 601 | 0.4328 | 0.6575 | 0.5220 | 0.9258 |
+ | No log | 10.0 | 450 | 0.4920 | 0.6630 | 0.8243 | 0.7349 | 74 | 0.6465 | 0.8767 | 0.7442 | 73 | 0.3636 | 0.5091 | 0.4242 | 55 | 0.4229 | 0.5524 | 0.4791 | 601 | 0.4606 | 0.6040 | 0.5226 | 0.9357 |
+ | No log | 11.0 | 495 | 0.4504 | 0.7625 | 0.8243 | 0.7922 | 74 | 0.7093 | 0.8356 | 0.7673 | 73 | 0.5085 | 0.5455 | 0.5263 | 55 | 0.4569 | 0.6173 | 0.5251 | 601 | 0.5043 | 0.6513 | 0.5685 | 0.9386 |
+ | 0.254 | 12.0 | 540 | 0.6719 | 0.7356 | 0.8649 | 0.7950 | 74 | 0.625 | 0.8219 | 0.7101 | 73 | 0.4265 | 0.5273 | 0.4715 | 55 | 0.4284 | 0.6323 | 0.5108 | 601 | 0.4684 | 0.6638 | 0.5492 | 0.9336 |
+ | 0.254 | 13.0 | 585 | 0.4562 | 0.7241 | 0.8514 | 0.7826 | 74 | 0.5299 | 0.8493 | 0.6526 | 73 | 0.3973 | 0.5273 | 0.4531 | 55 | 0.4171 | 0.5691 | 0.4814 | 601 | 0.4521 | 0.6177 | 0.5221 | 0.9364 |
+ | 0.254 | 14.0 | 630 | 0.5342 | 0.7765 | 0.8919 | 0.8302 | 74 | 0.7284 | 0.8082 | 0.7662 | 73 | 0.4308 | 0.5091 | 0.4667 | 55 | 0.4410 | 0.6156 | 0.5139 | 601 | 0.4888 | 0.6513 | 0.5585 | 0.9393 |
+ | 0.254 | 15.0 | 675 | 0.5533 | 0.8158 | 0.8378 | 0.8267 | 74 | 0.6835 | 0.7397 | 0.7105 | 73 | 0.3594 | 0.4182 | 0.3866 | 55 | 0.4348 | 0.5990 | 0.5038 | 601 | 0.4766 | 0.6214 | 0.5395 | 0.9388 |
+ | 0.254 | 16.0 | 720 | 0.5893 | 0.7191 | 0.8649 | 0.7853 | 74 | 0.6977 | 0.8219 | 0.7547 | 73 | 0.3571 | 0.4545 | 0.4 | 55 | 0.4411 | 0.6106 | 0.5122 | 601 | 0.4791 | 0.6426 | 0.5489 | 0.9394 |
+ | 0.254 | 17.0 | 765 | 0.5400 | 0.8182 | 0.8514 | 0.8344 | 74 | 0.7089 | 0.7671 | 0.7368 | 73 | 0.3448 | 0.3636 | 0.3540 | 55 | 0.4509 | 0.6106 | 0.5187 | 601 | 0.4922 | 0.6301 | 0.5527 | 0.9409 |
+ | 0.254 | 18.0 | 810 | 0.6109 | 0.7033 | 0.8649 | 0.7758 | 74 | 0.6778 | 0.8356 | 0.7485 | 73 | 0.4462 | 0.5273 | 0.4833 | 55 | 0.4399 | 0.6090 | 0.5108 | 601 | 0.4824 | 0.6476 | 0.5529 | 0.9395 |
+ | 0.254 | 19.0 | 855 | 0.5980 | 0.8442 | 0.8784 | 0.8609 | 74 | 0.6867 | 0.7808 | 0.7308 | 73 | 0.3538 | 0.4182 | 0.3833 | 55 | 0.4530 | 0.6090 | 0.5195 | 601 | 0.4947 | 0.6364 | 0.5566 | 0.9406 |
+ | 0.254 | 20.0 | 900 | 0.6257 | 0.7711 | 0.8649 | 0.8153 | 74 | 0.6593 | 0.8219 | 0.7317 | 73 | 0.3247 | 0.4545 | 0.3788 | 55 | 0.4466 | 0.6190 | 0.5188 | 601 | 0.4806 | 0.6488 | 0.5522 | 0.9397 |
+ | 0.254 | 21.0 | 945 | 0.6424 | 0.8077 | 0.8514 | 0.8289 | 74 | 0.6829 | 0.7671 | 0.7226 | 73 | 0.4694 | 0.4182 | 0.4423 | 55 | 0.4543 | 0.6123 | 0.5216 | 601 | 0.5005 | 0.6351 | 0.5598 | 0.9402 |
+ | 0.254 | 22.0 | 990 | 0.6328 | 0.7442 | 0.8649 | 0.8 | 74 | 0.7024 | 0.8082 | 0.7516 | 73 | 0.4032 | 0.4545 | 0.4274 | 55 | 0.4465 | 0.6173 | 0.5182 | 601 | 0.4882 | 0.6463 | 0.5563 | 0.9389 |
+ | 0.0035 | 23.0 | 1035 | 0.5990 | 0.7529 | 0.8649 | 0.8050 | 74 | 0.6988 | 0.7945 | 0.7436 | 73 | 0.3580 | 0.5273 | 0.4265 | 55 | 0.4352 | 0.6256 | 0.5133 | 601 | 0.4735 | 0.6563 | 0.5501 | 0.9371 |
+ | 0.0035 | 24.0 | 1080 | 0.5796 | 0.8228 | 0.8784 | 0.8497 | 74 | 0.7011 | 0.8356 | 0.7625 | 73 | 0.4054 | 0.5455 | 0.4651 | 55 | 0.4665 | 0.6140 | 0.5302 | 601 | 0.5092 | 0.6538 | 0.5725 | 0.9408 |
+ | 0.0035 | 25.0 | 1125 | 0.6002 | 0.8125 | 0.8784 | 0.8442 | 74 | 0.7143 | 0.8219 | 0.7643 | 73 | 0.4058 | 0.5091 | 0.4516 | 55 | 0.4672 | 0.6273 | 0.5355 | 601 | 0.5096 | 0.6600 | 0.5751 | 0.9410 |
+ | 0.0035 | 26.0 | 1170 | 0.5872 | 0.7927 | 0.8784 | 0.8333 | 74 | 0.7024 | 0.8082 | 0.7516 | 73 | 0.3919 | 0.5273 | 0.4496 | 55 | 0.4608 | 0.6156 | 0.5271 | 601 | 0.5014 | 0.6513 | 0.5666 | 0.9407 |
+ | 0.0035 | 27.0 | 1215 | 0.6015 | 0.8025 | 0.8784 | 0.8387 | 74 | 0.7108 | 0.8082 | 0.7564 | 73 | 0.4154 | 0.4909 | 0.45 | 55 | 0.4538 | 0.6123 | 0.5212 | 601 | 0.4990 | 0.6463 | 0.5632 | 0.9408 |
+ | 0.0035 | 28.0 | 1260 | 0.6268 | 0.8049 | 0.8919 | 0.8462 | 74 | 0.7059 | 0.8219 | 0.7595 | 73 | 0.4110 | 0.5455 | 0.4688 | 55 | 0.4641 | 0.6240 | 0.5323 | 601 | 0.5067 | 0.6613 | 0.5737 | 0.9410 |
+ | 0.0035 | 29.0 | 1305 | 0.6410 | 0.8049 | 0.8919 | 0.8462 | 74 | 0.7143 | 0.8219 | 0.7643 | 73 | 0.4085 | 0.5273 | 0.4603 | 55 | 0.4682 | 0.6240 | 0.5350 | 601 | 0.5106 | 0.6600 | 0.5758 | 0.9412 |
+ | 0.0035 | 30.0 | 1350 | 0.6460 | 0.7976 | 0.9054 | 0.8481 | 74 | 0.6383 | 0.8219 | 0.7186 | 73 | 0.3816 | 0.5273 | 0.4427 | 55 | 0.4478 | 0.6206 | 0.5202 | 601 | 0.4867 | 0.6588 | 0.5598 | 0.9395 |
+ | 0.0035 | 31.0 | 1395 | 0.6601 | 0.7952 | 0.8919 | 0.8408 | 74 | 0.6742 | 0.8219 | 0.7407 | 73 | 0.3827 | 0.5636 | 0.4559 | 55 | 0.4412 | 0.5990 | 0.5081 | 601 | 0.4836 | 0.6438 | 0.5524 | 0.9387 |
+ | 0.0035 | 32.0 | 1440 | 0.6641 | 0.8025 | 0.8784 | 0.8387 | 74 | 0.6667 | 0.8219 | 0.7362 | 73 | 0.3827 | 0.5636 | 0.4559 | 55 | 0.4410 | 0.5973 | 0.5074 | 601 | 0.4831 | 0.6413 | 0.5511 | 0.9382 |
+ | 0.0035 | 33.0 | 1485 | 0.6745 | 0.7586 | 0.8919 | 0.8199 | 74 | 0.6667 | 0.8219 | 0.7362 | 73 | 0.3827 | 0.5636 | 0.4559 | 55 | 0.4490 | 0.6073 | 0.5163 | 601 | 0.4874 | 0.6501 | 0.5571 | 0.9393 |
+ | 0.0064 | 34.0 | 1530 | 0.7091 | 0.7805 | 0.8649 | 0.8205 | 74 | 0.6552 | 0.7808 | 0.7125 | 73 | 0.3974 | 0.5636 | 0.4662 | 55 | 0.4606 | 0.6223 | 0.5294 | 601 | 0.4967 | 0.6550 | 0.5650 | 0.9398 |
+ | 0.0064 | 35.0 | 1575 | 0.7150 | 0.7778 | 0.8514 | 0.8129 | 74 | 0.6552 | 0.7808 | 0.7125 | 73 | 0.4133 | 0.5636 | 0.4769 | 55 | 0.4681 | 0.6223 | 0.5343 | 601 | 0.5038 | 0.6538 | 0.5691 | 0.9399 |
+ | 0.0064 | 36.0 | 1620 | 0.7147 | 0.7901 | 0.8649 | 0.8258 | 74 | 0.6477 | 0.7808 | 0.7081 | 73 | 0.3974 | 0.5636 | 0.4662 | 55 | 0.4669 | 0.6223 | 0.5335 | 601 | 0.5019 | 0.6550 | 0.5683 | 0.9401 |
+ | 0.0064 | 37.0 | 1665 | 0.7586 | 0.8356 | 0.8243 | 0.8299 | 74 | 0.6951 | 0.7808 | 0.7355 | 73 | 0.4054 | 0.5455 | 0.4651 | 55 | 0.4705 | 0.6240 | 0.5365 | 601 | 0.5097 | 0.6513 | 0.5719 | 0.9408 |
+ | 0.0064 | 38.0 | 1710 | 0.6612 | 0.8182 | 0.8514 | 0.8344 | 74 | 0.7229 | 0.8219 | 0.7692 | 73 | 0.3846 | 0.5455 | 0.4511 | 55 | 0.4600 | 0.6323 | 0.5326 | 601 | 0.5009 | 0.6638 | 0.5710 | 0.9383 |
+ | 0.0064 | 39.0 | 1755 | 0.6656 | 0.8052 | 0.8378 | 0.8212 | 74 | 0.6667 | 0.7945 | 0.725 | 73 | 0.3816 | 0.5273 | 0.4427 | 55 | 0.4496 | 0.6007 | 0.5142 | 601 | 0.4890 | 0.6351 | 0.5525 | 0.9395 |
+ | 0.0064 | 40.0 | 1800 | 0.7028 | 0.8 | 0.8649 | 0.8312 | 74 | 0.6932 | 0.8356 | 0.7578 | 73 | 0.3671 | 0.5273 | 0.4328 | 55 | 0.4507 | 0.6389 | 0.5286 | 601 | 0.4895 | 0.6700 | 0.5657 | 0.9373 |
+ | 0.0064 | 41.0 | 1845 | 0.6765 | 0.8205 | 0.8649 | 0.8421 | 74 | 0.6977 | 0.8219 | 0.7547 | 73 | 0.3671 | 0.5273 | 0.4328 | 55 | 0.4537 | 0.5957 | 0.5151 | 601 | 0.4952 | 0.6364 | 0.5569 | 0.9396 |
+ | 0.0064 | 42.0 | 1890 | 0.6769 | 0.8205 | 0.8649 | 0.8421 | 74 | 0.6824 | 0.7945 | 0.7342 | 73 | 0.3919 | 0.5273 | 0.4496 | 55 | 0.4652 | 0.6123 | 0.5287 | 601 | 0.5049 | 0.6463 | 0.5669 | 0.9410 |
+ | 0.0064 | 43.0 | 1935 | 0.6826 | 0.8205 | 0.8649 | 0.8421 | 74 | 0.6941 | 0.8082 | 0.7468 | 73 | 0.3590 | 0.5091 | 0.4211 | 55 | 0.4696 | 0.6173 | 0.5334 | 601 | 0.5063 | 0.6501 | 0.5692 | 0.9410 |
+ | 0.0064 | 44.0 | 1980 | 0.6828 | 0.8205 | 0.8649 | 0.8421 | 74 | 0.6824 | 0.7945 | 0.7342 | 73 | 0.3733 | 0.5091 | 0.4308 | 55 | 0.4662 | 0.6190 | 0.5318 | 601 | 0.5039 | 0.6501 | 0.5677 | 0.9411 |
+ | 0.0013 | 45.0 | 2025 | 0.6757 | 0.8125 | 0.8784 | 0.8442 | 74 | 0.6897 | 0.8219 | 0.75 | 73 | 0.3816 | 0.5273 | 0.4427 | 55 | 0.4583 | 0.6306 | 0.5308 | 601 | 0.4981 | 0.6638 | 0.5691 | 0.9405 |
+ | 0.0013 | 46.0 | 2070 | 0.6777 | 0.7901 | 0.8649 | 0.8258 | 74 | 0.6705 | 0.8082 | 0.7329 | 73 | 0.3590 | 0.5091 | 0.4211 | 55 | 0.4612 | 0.6323 | 0.5333 | 601 | 0.4958 | 0.6613 | 0.5667 | 0.9405 |
+ | 0.0013 | 47.0 | 2115 | 0.6801 | 0.8 | 0.8649 | 0.8312 | 74 | 0.6818 | 0.8219 | 0.7453 | 73 | 0.3816 | 0.5273 | 0.4427 | 55 | 0.4590 | 0.6339 | 0.5325 | 601 | 0.4972 | 0.6650 | 0.5690 | 0.9407 |
+ | 0.0013 | 48.0 | 2160 | 0.6828 | 0.7857 | 0.8919 | 0.8354 | 74 | 0.6593 | 0.8219 | 0.7317 | 73 | 0.3896 | 0.5455 | 0.4545 | 55 | 0.4694 | 0.6256 | 0.5364 | 601 | 0.5052 | 0.6625 | 0.5733 | 0.9408 |
+ | 0.0013 | 49.0 | 2205 | 0.6836 | 0.7927 | 0.8784 | 0.8333 | 74 | 0.6593 | 0.8219 | 0.7317 | 73 | 0.3919 | 0.5273 | 0.4496 | 55 | 0.4649 | 0.6173 | 0.5304 | 601 | 0.5024 | 0.6538 | 0.5682 | 0.9410 |
+ | 0.0013 | 50.0 | 2250 | 0.6841 | 0.7927 | 0.8784 | 0.8333 | 74 | 0.6593 | 0.8219 | 0.7317 | 73 | 0.3919 | 0.5273 | 0.4496 | 55 | 0.4662 | 0.6190 | 0.5318 | 601 | 0.5033 | 0.6550 | 0.5693 | 0.9409 |
 
 
  ### Framework versions
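
The headline figures in the updated card are exactly the epoch-50 row of the training-results table, and the per-label and overall numbers are mutually consistent if the overall values are micro-averages over the four entity types. Below is a small consistency check using only the rounded values from the card; the micro-averaging assumption and the label set are inferred, not stated in the card, so treat it as a sketch rather than the evaluation code that produced these numbers.

```python
# Rounded per-label metrics from the updated card (epoch 50 / final evaluation).
labels = {
    # label: (precision, recall, support)
    "Law":         (0.7927, 0.8784, 74),
    "Violated by": (0.6593, 0.8219, 73),
    "Violated on": (0.3919, 0.5273, 55),
    "Violation":   (0.4662, 0.6190, 601),
}

def f1(p, r):
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r) if (p + r) else 0.0

# Per-label F1, e.g. Law -> ~0.8333, matching the card.
for name, (p, r, _) in labels.items():
    print(f"{name}: F1 ~ {f1(p, r):.4f}")

# Micro-average: pool estimated true-positive and predicted counts across labels.
tp = sum(r * n for _, r, n in labels.values())        # ~526 entities correctly found
pred = sum(r * n / p for p, r, n in labels.values())  # ~1045 entities predicted
gold = sum(n for _, _, n in labels.values())          # 803 gold entities
print(f"overall precision ~ {tp / pred:.4f}")   # ~0.503
print(f"overall recall    ~ {tp / gold:.4f}")   # ~0.655
print(f"overall F1        ~ {f1(tp / pred, tp / gold):.4f}")  # ~0.569
```

Run as-is, this reproduces the card's Law F1 of 0.8333 and the overall precision / recall / F1 of roughly 0.5033 / 0.6550 / 0.5693, up to the rounding of the published values.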
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:13f6d5f70635b3b34a9a7a0740e38325f89fa5d91b4aa0caf55f6efe4442a79d
+ oid sha256:21a503187cdc7e6d02f54207e670831be6aa155bea4d5c3c83e982e4e172496c
  size 54617130
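
The second file in the commit, adapter_model.safetensors, is the updated PEFT adapter produced by this training run. A minimal sketch of loading it on top of the base checkpoint is below; the adapter repository id is a placeholder (the excerpt above does not show it), and the token-classification head with 9 labels is an assumption inferred from the entity-level metrics (four entity types in a BIO scheme plus the outside tag), not something the card states.

```python
# Sketch only: assumes an NER-style (token classification) LoRA/PEFT adapter.
from transformers import AutoModelForTokenClassification, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-v0.3"
adapter_id = "your-username/your-adapter-repo"  # placeholder, not the real repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)

# num_labels=9 is an assumption: {Law, Violated by, Violated on, Violation}
# as B-/I- pairs plus "O". Adjust to the label list the adapter was trained with.
base_model = AutoModelForTokenClassification.from_pretrained(
    base_id, num_labels=9, torch_dtype="auto"
)

# Attach the trained adapter weights (adapter_model.safetensors) to the base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()
```

If the repository follows the usual PEFT layout, the adapter_config.json stored next to the weights records the actual task type and target modules, so check it before relying on the head assumed here.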