gossminn committed on

Commit 0b7dbf7
1 Parent(s): df95eaa

update model card README.md

Files changed (1):
  1. README.md +96 -0

README.md ADDED
---
license: mit
tags:
- generated_from_trainer
model-index:
- name: predict-perception-xlmr-blame-none
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# predict-perception-xlmr-blame-none

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8941
- Rmse: 1.1259
- Rmse Blame::a Nessuno: 1.1259
- Mae: 0.8559
- Mae Blame::a Nessuno: 0.8559
- R2: 0.2847
- R2 Blame::a Nessuno: 0.2847
- Cos: 0.3043
- Pair: 0.0
- Rank: 0.5
- Neighbors: 0.3537
- Rsa: nan

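The auto-generated card stops at the metrics above and includes no usage snippet; the sketch below shows one plausible way to run inference with 🤗 Transformers. The repository id `gossminn/predict-perception-xlmr-blame-none` and the single-logit regression head are assumptions, not stated anywhere in the card.

```python
# Minimal inference sketch (assumptions: the checkpoint lives under the
# hypothetical repo id below and exposes a single regression logit).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "gossminn/predict-perception-xlmr-blame-none"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "Nessuno è responsabile dell'accaduto."  # placeholder input sentence
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# With a one-dimensional regression head, the lone logit is the predicted
# perception score for the "Blame: a Nessuno" dimension.
print(logits.squeeze().item())
```
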
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 20
- eval_batch_size: 8
- seed: 1996
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30

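The list above uses the Trainer's own naming; a minimal sketch of how these values would map onto `transformers.TrainingArguments` is given below (the `output_dir` value is a placeholder, and the dataset/model wiring is omitted).

```python
# Hedged reconstruction of the training configuration listed above;
# only the hyperparameters shown in the card are set explicitly.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="predict-perception-xlmr-blame-none",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=8,
    seed=1996,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
)
```
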
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rmse | Rmse Blame::a Nessuno | Mae | Mae Blame::a Nessuno | R2 | R2 Blame::a Nessuno | Cos | Pair | Rank | Neighbors | Rsa |
|:-------------:|:-----:|:----:|:---------------:|:------:|:---------------------:|:------:|:--------------------:|:-------:|:-------------------:|:-------:|:----:|:----:|:---------:|:---:|
| 1.042 | 1.0 | 15 | 1.2746 | 1.3443 | 1.3443 | 1.1788 | 1.1788 | -0.0197 | -0.0197 | 0.0435 | 0.0 | 0.5 | 0.2970 | nan |
| 0.9994 | 2.0 | 30 | 1.3264 | 1.3714 | 1.3714 | 1.1967 | 1.1967 | -0.0612 | -0.0612 | -0.0435 | 0.0 | 0.5 | 0.2961 | nan |
| 0.9123 | 3.0 | 45 | 1.2511 | 1.3319 | 1.3319 | 1.0932 | 1.0932 | -0.0009 | -0.0009 | 0.1304 | 0.0 | 0.5 | 0.2681 | nan |
| 0.741 | 4.0 | 60 | 1.0204 | 1.2028 | 1.2028 | 0.9818 | 0.9818 | 0.1836 | 0.1836 | 0.3043 | 0.0 | 0.5 | 0.3686 | nan |
| 0.6337 | 5.0 | 75 | 0.8607 | 1.1047 | 1.1047 | 0.8145 | 0.8145 | 0.3115 | 0.3115 | 0.3913 | 0.0 | 0.5 | 0.4044 | nan |
| 0.4974 | 6.0 | 90 | 0.8574 | 1.1026 | 1.1026 | 0.8095 | 0.8095 | 0.3140 | 0.3140 | 0.3913 | 0.0 | 0.5 | 0.4044 | nan |
| 0.4929 | 7.0 | 105 | 0.8548 | 1.1009 | 1.1009 | 0.8560 | 0.8560 | 0.3161 | 0.3161 | 0.3043 | 0.0 | 0.5 | 0.3686 | nan |
| 0.4378 | 8.0 | 120 | 0.6974 | 0.9944 | 0.9944 | 0.7503 | 0.7503 | 0.4421 | 0.4421 | 0.3043 | 0.0 | 0.5 | 0.3686 | nan |
| 0.3999 | 9.0 | 135 | 0.7955 | 1.0620 | 1.0620 | 0.7907 | 0.7907 | 0.3636 | 0.3636 | 0.3913 | 0.0 | 0.5 | 0.4044 | nan |
| 0.3715 | 10.0 | 150 | 0.8954 | 1.1267 | 1.1267 | 0.8036 | 0.8036 | 0.2837 | 0.2837 | 0.4783 | 0.0 | 0.5 | 0.4058 | nan |
| 0.3551 | 11.0 | 165 | 0.8449 | 1.0945 | 1.0945 | 0.8748 | 0.8748 | 0.3241 | 0.3241 | 0.3913 | 0.0 | 0.5 | 0.3931 | nan |
| 0.3428 | 12.0 | 180 | 0.7960 | 1.0624 | 1.0624 | 0.8000 | 0.8000 | 0.3632 | 0.3632 | 0.3913 | 0.0 | 0.5 | 0.4044 | nan |
| 0.2923 | 13.0 | 195 | 0.9027 | 1.1313 | 1.1313 | 0.8441 | 0.8441 | 0.2778 | 0.2778 | 0.3043 | 0.0 | 0.5 | 0.3537 | nan |
| 0.2236 | 14.0 | 210 | 0.8914 | 1.1242 | 1.1242 | 0.8998 | 0.8998 | 0.2869 | 0.2869 | 0.2174 | 0.0 | 0.5 | 0.3324 | nan |
| 0.2553 | 15.0 | 225 | 0.9184 | 1.1411 | 1.1411 | 0.8633 | 0.8633 | 0.2652 | 0.2652 | 0.3043 | 0.0 | 0.5 | 0.3537 | nan |
| 0.2064 | 16.0 | 240 | 0.9284 | 1.1473 | 1.1473 | 0.8919 | 0.8919 | 0.2573 | 0.2573 | 0.3043 | 0.0 | 0.5 | 0.3537 | nan |
| 0.1972 | 17.0 | 255 | 0.9495 | 1.1602 | 1.1602 | 0.8768 | 0.8768 | 0.2404 | 0.2404 | 0.3043 | 0.0 | 0.5 | 0.3537 | nan |
| 0.1622 | 18.0 | 270 | 0.9850 | 1.1818 | 1.1818 | 0.9303 | 0.9303 | 0.2120 | 0.2120 | 0.2174 | 0.0 | 0.5 | 0.3324 | nan |
| 0.1685 | 19.0 | 285 | 0.9603 | 1.1669 | 1.1669 | 0.8679 | 0.8679 | 0.2317 | 0.2317 | 0.3043 | 0.0 | 0.5 | 0.3537 | nan |
| 0.1773 | 20.0 | 300 | 0.9269 | 1.1464 | 1.1464 | 0.8391 | 0.8391 | 0.2585 | 0.2585 | 0.3043 | 0.0 | 0.5 | 0.3537 | nan |
| 0.1716 | 21.0 | 315 | 0.8936 | 1.1256 | 1.1256 | 0.8357 | 0.8357 | 0.2851 | 0.2851 | 0.3043 | 0.0 | 0.5 | 0.3537 | nan |
| 0.161 | 22.0 | 330 | 0.8894 | 1.1230 | 1.1230 | 0.8593 | 0.8593 | 0.2884 | 0.2884 | 0.3043 | 0.0 | 0.5 | 0.3537 | nan |
| 0.1297 | 23.0 | 345 | 0.8997 | 1.1294 | 1.1294 | 0.8568 | 0.8568 | 0.2802 | 0.2802 | 0.3043 | 0.0 | 0.5 | 0.3537 | nan |
| 0.15 | 24.0 | 360 | 0.8748 | 1.1137 | 1.1137 | 0.8541 | 0.8541 | 0.3002 | 0.3002 | 0.2174 | 0.0 | 0.5 | 0.3324 | nan |
| 0.1149 | 25.0 | 375 | 0.9264 | 1.1461 | 1.1461 | 0.8682 | 0.8682 | 0.2588 | 0.2588 | 0.3913 | 0.0 | 0.5 | 0.3901 | nan |
| 0.1354 | 26.0 | 390 | 0.8829 | 1.1188 | 1.1188 | 0.8608 | 0.8608 | 0.2937 | 0.2937 | 0.2174 | 0.0 | 0.5 | 0.3324 | nan |
| 0.1321 | 27.0 | 405 | 0.9137 | 1.1382 | 1.1382 | 0.8656 | 0.8656 | 0.2691 | 0.2691 | 0.3043 | 0.0 | 0.5 | 0.3537 | nan |
| 0.1154 | 28.0 | 420 | 0.8774 | 1.1154 | 1.1154 | 0.8488 | 0.8488 | 0.2980 | 0.2980 | 0.2174 | 0.0 | 0.5 | 0.3324 | nan |
| 0.1112 | 29.0 | 435 | 0.8985 | 1.1287 | 1.1287 | 0.8562 | 0.8562 | 0.2812 | 0.2812 | 0.3043 | 0.0 | 0.5 | 0.3537 | nan |
| 0.1525 | 30.0 | 450 | 0.8941 | 1.1259 | 1.1259 | 0.8559 | 0.8559 | 0.2847 | 0.2847 | 0.3043 | 0.0 | 0.5 | 0.3537 | nan |

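For reference, the Rmse, Mae, and R2 columns correspond to the standard regression metrics; a small sketch of how they are computed is given below. Whether the task-specific columns (Cos, Pair, Rank, Neighbors, Rsa) follow comparable definitions is not documented in this card, so they are not reproduced here.

```python
# Standard regression metrics matching the Rmse / Mae / R2 columns above
# (assumption: the card uses the textbook definitions). Values are dummy data.
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

y_true = np.array([1.0, 3.0, 2.0, 5.0])  # hypothetical gold perception scores
y_pred = np.array([1.5, 2.5, 2.0, 4.0])  # hypothetical model predictions

rmse = np.sqrt(mean_squared_error(y_true, y_pred))
mae = mean_absolute_error(y_true, y_pred)
r2 = r2_score(y_true, y_pred)
print(f"RMSE={rmse:.4f}  MAE={mae:.4f}  R2={r2:.4f}")
```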

### Framework versions

- Transformers 4.16.2
- PyTorch 1.10.2+cu113
- Datasets 1.18.3
- Tokenizers 0.11.0