Archana02 committed on
Commit 0866536
1 Parent(s): d1cc02d

End of training

Files changed (1)
  1. README.md +27 -31
README.md CHANGED
@@ -2,8 +2,6 @@
  base_model: microsoft/layoutlm-base-uncased
  tags:
  - generated_from_trainer
- datasets:
- - funsd
  model-index:
  - name: layoutlm-funsd
    results: []
@@ -14,16 +12,26 @@ should probably proofread and complete it, then remove this comment. -->

  # layoutlm-funsd

- This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset.
+ This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.7048
- - Answer: {'precision': 0.6989130434782609, 'recall': 0.7948084054388134, 'f1': 0.7437825332562175, 'number': 809}
- - Header: {'precision': 0.3049645390070922, 'recall': 0.36134453781512604, 'f1': 0.3307692307692308, 'number': 119}
- - Question: {'precision': 0.7769973661106233, 'recall': 0.8309859154929577, 'f1': 0.8030852994555354, 'number': 1065}
- - Overall Precision: 0.7141
- - Overall Recall: 0.7883
- - Overall F1: 0.7493
- - Overall Accuracy: 0.8041
+ - Loss: 3.0586
+ - Ame: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3}
+ - Anguages: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0}
+ - Ducation: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2}
+ - Echnical skills: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2}
+ - Escriptions: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1}
+ - Esignation: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4}
+ - Hone number: {'precision': 0.75, 'recall': 1.0, 'f1': 0.8571428571428571, 'number': 3}
+ - Ithub: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1}
+ - Mail: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2}
+ - Ocation: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1}
+ - Ork experience company: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4}
+ - Ork experience role: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4}
+ - Rojects: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1}
+ - Overall Precision: 0.12
+ - Overall Recall: 0.1071
+ - Overall F1: 0.1132
+ - Overall Accuracy: 0.1429

  ## Model description

@@ -43,32 +51,20 @@ More information needed

  The following hyperparameters were used during training:
  - learning_rate: 3e-05
- - train_batch_size: 16
- - eval_batch_size: 8
+ - train_batch_size: 2
+ - eval_batch_size: 2
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 15
+ - num_epochs: 3

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 1.8962 | 1.0 | 10 | 1.6314 | {'precision': 0.04096170970614425, 'recall': 0.05686032138442522, 'f1': 0.047619047619047616, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.21477162293488825, 'recall': 0.20751173708920187, 'f1': 0.21107927411652339, 'number': 1065} | 0.1241 | 0.1340 | 0.1288 | 0.3878 |
- | 1.4949 | 2.0 | 20 | 1.2748 | {'precision': 0.16384915474642392, 'recall': 0.1557478368355995, 'f1': 0.1596958174904943, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.42642857142857143, 'recall': 0.5605633802816902, 'f1': 0.48438133874239353, 'number': 1065} | 0.3333 | 0.3628 | 0.3474 | 0.5644 |
- | 1.1275 | 3.0 | 30 | 0.9637 | {'precision': 0.4481236203090508, 'recall': 0.5018541409147095, 'f1': 0.473469387755102, 'number': 809} | {'precision': 0.030303030303030304, 'recall': 0.008403361344537815, 'f1': 0.013157894736842105, 'number': 119} | {'precision': 0.6073883161512027, 'recall': 0.6638497652582159, 'f1': 0.6343651861821444, 'number': 1065} | 0.5297 | 0.5590 | 0.5439 | 0.6927 |
- | 0.8515 | 4.0 | 40 | 0.8074 | {'precision': 0.5718446601941748, 'recall': 0.7280593325092707, 'f1': 0.6405655247417075, 'number': 809} | {'precision': 0.14492753623188406, 'recall': 0.08403361344537816, 'f1': 0.10638297872340426, 'number': 119} | {'precision': 0.6351132686084142, 'recall': 0.7370892018779343, 'f1': 0.6823120382442416, 'number': 1065} | 0.5927 | 0.6944 | 0.6396 | 0.7481 |
- | 0.6885 | 5.0 | 50 | 0.7250 | {'precision': 0.6376963350785341, 'recall': 0.7527812113720643, 'f1': 0.6904761904761906, 'number': 809} | {'precision': 0.23529411764705882, 'recall': 0.13445378151260504, 'f1': 0.17112299465240638, 'number': 119} | {'precision': 0.7158081705150977, 'recall': 0.7568075117370892, 'f1': 0.735737106344135, 'number': 1065} | 0.6659 | 0.7180 | 0.6910 | 0.7705 |
- | 0.582 | 6.0 | 60 | 0.6796 | {'precision': 0.6464323748668797, 'recall': 0.7503090234857849, 'f1': 0.694508009153318, 'number': 809} | {'precision': 0.2204724409448819, 'recall': 0.23529411764705882, 'f1': 0.22764227642276422, 'number': 119} | {'precision': 0.6873015873015873, 'recall': 0.8131455399061033, 'f1': 0.7449462365591398, 'number': 1065} | 0.6453 | 0.7531 | 0.6951 | 0.7920 |
- | 0.505 | 7.0 | 70 | 0.6522 | {'precision': 0.6307385229540918, 'recall': 0.7812113720642769, 'f1': 0.6979569298729985, 'number': 809} | {'precision': 0.21367521367521367, 'recall': 0.21008403361344538, 'f1': 0.211864406779661, 'number': 119} | {'precision': 0.7270450751252087, 'recall': 0.8178403755868544, 'f1': 0.7697746354396818, 'number': 1065} | 0.6595 | 0.7667 | 0.7090 | 0.7973 |
- | 0.4537 | 8.0 | 80 | 0.6537 | {'precision': 0.6717391304347826, 'recall': 0.7639060568603214, 'f1': 0.714864083285136, 'number': 809} | {'precision': 0.272, 'recall': 0.2857142857142857, 'f1': 0.27868852459016397, 'number': 119} | {'precision': 0.7433110367892977, 'recall': 0.8347417840375587, 'f1': 0.7863777089783283, 'number': 1065} | 0.6876 | 0.7732 | 0.7279 | 0.8010 |
- | 0.3975 | 9.0 | 90 | 0.6624 | {'precision': 0.6649269311064718, 'recall': 0.7873918417799752, 'f1': 0.7209960384833051, 'number': 809} | {'precision': 0.2676056338028169, 'recall': 0.31932773109243695, 'f1': 0.2911877394636015, 'number': 119} | {'precision': 0.7576285963382737, 'recall': 0.815962441314554, 'f1': 0.7857142857142857, 'number': 1065} | 0.6871 | 0.7747 | 0.7283 | 0.8007 |
- | 0.3619 | 10.0 | 100 | 0.6649 | {'precision': 0.6825053995680346, 'recall': 0.7812113720642769, 'f1': 0.7285302593659942, 'number': 809} | {'precision': 0.3178294573643411, 'recall': 0.3445378151260504, 'f1': 0.33064516129032256, 'number': 119} | {'precision': 0.7808098591549296, 'recall': 0.8328638497652582, 'f1': 0.8059972739663789, 'number': 1065} | 0.7120 | 0.7827 | 0.7457 | 0.8012 |
- | 0.3266 | 11.0 | 110 | 0.6796 | {'precision': 0.6776246023329798, 'recall': 0.7898640296662547, 'f1': 0.7294520547945205, 'number': 809} | {'precision': 0.2740740740740741, 'recall': 0.31092436974789917, 'f1': 0.29133858267716534, 'number': 119} | {'precision': 0.7656116338751069, 'recall': 0.8403755868544601, 'f1': 0.801253357206804, 'number': 1065} | 0.6992 | 0.7883 | 0.7410 | 0.8007 |
- | 0.3091 | 12.0 | 120 | 0.6863 | {'precision': 0.6908893709327549, 'recall': 0.7873918417799752, 'f1': 0.7359907567879839, 'number': 809} | {'precision': 0.3157894736842105, 'recall': 0.35294117647058826, 'f1': 0.33333333333333337, 'number': 119} | {'precision': 0.7680278019113814, 'recall': 0.8300469483568075, 'f1': 0.7978339350180504, 'number': 1065} | 0.7085 | 0.7842 | 0.7445 | 0.8036 |
- | 0.2903 | 13.0 | 130 | 0.7025 | {'precision': 0.6961748633879782, 'recall': 0.7873918417799752, 'f1': 0.7389791183294663, 'number': 809} | {'precision': 0.29931972789115646, 'recall': 0.3697478991596639, 'f1': 0.33082706766917297, 'number': 119} | {'precision': 0.7734855136084284, 'recall': 0.8272300469483568, 'f1': 0.7994555353901996, 'number': 1065} | 0.7097 | 0.7837 | 0.7449 | 0.8029 |
- | 0.2685 | 14.0 | 140 | 0.7073 | {'precision': 0.6958424507658644, 'recall': 0.7861557478368356, 'f1': 0.7382472431804992, 'number': 809} | {'precision': 0.2896551724137931, 'recall': 0.35294117647058826, 'f1': 0.31818181818181823, 'number': 119} | {'precision': 0.7727666955767563, 'recall': 0.8366197183098592, 'f1': 0.8034265103697025, 'number': 1065} | 0.7093 | 0.7873 | 0.7463 | 0.8034 |
- | 0.2678 | 15.0 | 150 | 0.7048 | {'precision': 0.6989130434782609, 'recall': 0.7948084054388134, 'f1': 0.7437825332562175, 'number': 809} | {'precision': 0.3049645390070922, 'recall': 0.36134453781512604, 'f1': 0.3307692307692308, 'number': 119} | {'precision': 0.7769973661106233, 'recall': 0.8309859154929577, 'f1': 0.8030852994555354, 'number': 1065} | 0.7141 | 0.7883 | 0.7493 | 0.8041 |
+ | Training Loss | Epoch | Step | Validation Loss | Ame | Anguages | Ducation | Ear of experience | Echnical skills | Escriptions | Esignation | Hone number | Ithub | Mail | Ob | Ocation | Ork experience company | Ork experience role | Rojects | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 3.2249 | 1.0 | 3 | 3.1464 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 1.0, 'recall': 0.6666666666666666, 'f1': 0.8, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | 0.0909 | 0.0714 | 0.08 | 0.1071 |
+ | 2.9723 | 2.0 | 6 | 3.0839 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.6666666666666666, 'recall': 0.6666666666666666, 'f1': 0.6666666666666666, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | 0.0769 | 0.0714 | 0.0741 | 0.0714 |
+ | 2.8482 | 3.0 | 9 | 3.0586 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.75, 'recall': 1.0, 'f1': 0.8571428571428571, 'number': 3}| {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | 0.12 | 0.1071 | 0.1132 | 0.1429 |
 
  ### Framework versions
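
The hyperparameters in the updated card map onto a standard `transformers` `Trainer` setup. The sketch below is a reconstruction under stated assumptions, not the author's actual script: the label list, dataset variables, and `output_dir` are hypothetical placeholders, and only the numeric settings (learning rate 3e-05, train/eval batch size 2, 3 epochs, linear schedule, seed 42) come from the card.

```python
# Sketch only: a Trainer configuration consistent with the card's hyperparameters.
# Labels, datasets, and output_dir are placeholders, not taken from the card.
from transformers import (
    LayoutLMForTokenClassification,
    Trainer,
    TrainingArguments,
    set_seed,
)

set_seed(42)  # seed: 42

label_list = ["O", "B-NAME", "I-NAME", "B-EMAIL", "I-EMAIL"]  # hypothetical label set

model = LayoutLMForTokenClassification.from_pretrained(
    "microsoft/layoutlm-base-uncased",
    num_labels=len(label_list),
)

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",   # assumed output directory
    learning_rate=3e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    num_train_epochs=3,
    lr_scheduler_type="linear",    # Trainer default; listed explicitly in the card
    seed=42,
    evaluation_strategy="epoch",   # assumption: the results table shows one eval per epoch
)

# Dataset preparation and the metric function are not part of the card, so the
# Trainer wiring is left as an outline:
# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=encoded_train_dataset,
#     eval_dataset=encoded_eval_dataset,
#     compute_metrics=compute_metrics,
# )
# trainer.train()
```

The Adam settings listed in the card (betas=(0.9,0.999), epsilon=1e-08) are the `TrainingArguments` defaults, so they are not set explicitly in this sketch.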
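
The per-label entries in the new evaluation results ({'precision': ..., 'recall': ..., 'f1': ..., 'number': ...}) match the output format of the seqeval metric as loaded through the `evaluate` library, which auto-generated token-classification cards like this one typically report. A minimal, self-contained sketch with toy IOB2 tags (the tag names are made up, not this model's labels):

```python
# Minimal sketch of seqeval-style span metrics; toy data, not from this model.
import evaluate

seqeval = evaluate.load("seqeval")

references = [["B-NAME", "I-NAME", "O", "B-EMAIL"]]
predictions = [["B-NAME", "O", "O", "B-EMAIL"]]

results = seqeval.compute(predictions=predictions, references=references)

# Per-entity dicts carry the same keys as the card: precision, recall, f1, number.
print(results["NAME"])
print(results["EMAIL"])
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```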