g8a9 committed on
Commit 1f92f94
1 Parent(s): 7e05d16

update model card README.md

Files changed (1)
  1. README.md +11 -9
README.md CHANGED
@@ -14,10 +14,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 # vilt-b32-mlm-mami
 
-This model is a fine-tuned version of [dandelin/vilt-b32-mlm](https://huggingface.co/dandelin/vilt-b32-mlm) on the None dataset.
+This model is a fine-tuned version of [dandelin/vilt-b32-mlm](https://huggingface.co/dandelin/vilt-b32-mlm) on the MAMI dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.8414
-- F1: 0.7822
+- Loss: 0.6324
+- F1: 0.7900
 
 ## Model description
 
@@ -41,7 +41,7 @@ The following hyperparameters were used during training:
 - eval_batch_size: 16
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
-- lr_scheduler_type: constant_with_warmup
+- lr_scheduler_type: reduce_lr_on_plateau
 - lr_scheduler_warmup_ratio: 0.1
 - num_epochs: 10.0
 
@@ -49,11 +49,13 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | F1     |
 |:-------------:|:-----:|:----:|:---------------:|:------:|
-| 0.5835        | 0.96  | 200  | 0.5061          | 0.7544 |
-| 0.4219        | 1.91  | 400  | 0.4394          | 0.7948 |
-| 0.2613        | 2.87  | 600  | 0.4886          | 0.7897 |
-| 0.1399        | 3.83  | 800  | 0.6446          | 0.7892 |
-| 0.0707        | 4.78  | 1000 | 0.8414          | 0.7822 |
+| 0.6261        | 0.48  | 100  | 0.5528          | 0.7149 |
+| 0.4739        | 0.96  | 200  | 0.4688          | 0.7795 |
+| 0.3717        | 1.44  | 300  | 0.4479          | 0.7948 |
+| 0.3705        | 1.91  | 400  | 0.4385          | 0.7924 |
+| 0.253         | 2.39  | 500  | 0.5210          | 0.7956 |
+| 0.2095        | 2.87  | 600  | 0.5216          | 0.7965 |
+| 0.1371        | 3.35  | 700  | 0.6324          | 0.7900 |
 
 
 ### Framework versions
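
The scheduler swap in the second hunk (constant_with_warmup → reduce_lr_on_plateau) maps onto the `lr_scheduler_type` field of the 🤗 Transformers `TrainingArguments`. Below is a minimal sketch of a configuration mirroring the hyperparameters listed in the updated card; it assumes a `transformers` release that accepts `lr_scheduler_type="reduce_lr_on_plateau"`, and the output directory, evaluation cadence, and any value not shown in this diff (learning rate, train batch size) are illustrative placeholders, not taken from the commit.

```python
# Sketch only: mirrors the hyperparameters visible in the updated card.
# Anything marked "assumed" is a placeholder, not taken from this diff.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vilt-b32-mlm-mami",            # assumed output directory
    per_device_eval_batch_size=16,             # eval_batch_size: 16
    seed=42,                                   # seed: 42
    adam_beta1=0.9,                            # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                         # epsilon=1e-08
    lr_scheduler_type="reduce_lr_on_plateau",  # scheduler introduced by this commit
    warmup_ratio=0.1,                          # lr_scheduler_warmup_ratio: 0.1
    num_train_epochs=10.0,                     # num_epochs: 10.0
    evaluation_strategy="steps",               # assumed: the new table logs eval every 100 steps
    eval_steps=100,
    logging_steps=100,
)
```

The reduce_lr_on_plateau scheduler adjusts the learning rate from evaluation results, which is consistent with the per-100-step evaluation cadence visible in the new results table.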
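
For completeness, a usage sketch for loading the checkpoint described by the card. The repository id `g8a9/vilt-b32-mlm-mami` is inferred from the commit author and model name, and the card does not state which task head was added for MAMI, so the sketch only runs the shared ViLT encoder on an image-text pair; treat both points as assumptions.

```python
# Sketch only: repo id and absence of a documented task head are assumptions.
from PIL import Image
from transformers import ViltProcessor, ViltModel

processor = ViltProcessor.from_pretrained("dandelin/vilt-b32-mlm")  # base model's processor
model = ViltModel.from_pretrained("g8a9/vilt-b32-mlm-mami")         # assumed repo id

image = Image.open("meme.png")              # placeholder path to a meme image
text = "example caption from the meme"      # placeholder transcription
inputs = processor(image, text, return_tensors="pt")

outputs = model(**inputs)
features = outputs.pooler_output            # joint image-text representation, shape (1, hidden_size)
```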