Update README.md
language:
- en
metrics:
- seqeval
- accuracy
- f1
- recall
- precision
pipeline_tag: token-classification
---

This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the twitter_pos_vcb dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0533
- ''':
  - Precision: 0.9580645161290322
  - Recall: 0.9519230769230769
  - F1: 0.954983922829582
  - Number: 312
- B:
  - Precision: 0.9658270558694287
  - Recall: 0.9655240037652966
  - F1: 0.9656755060411109
  - Number: 25496
- Bd:
  - Precision: 0.9630099728014506
  - Recall: 0.9572819033886085
  - F1: 0.9601373949200036
  - Number: 5548
- Bg:
  - Precision: 0.9836065573770492
  - Recall: 0.9853434575313438
  - F1: 0.9844742413549753
  - Number: 5663
- Bn:
  - Precision: 0.9182209469153515
  - Recall: 0.9116809116809117
  - F1: 0.9149392423159399
  - Number: 2106
- Bp:
  - Precision: 0.9672037914691943
  - Recall: 0.9663488856619736
  - F1: 0.9667761495704902
  - Number: 15839
- Br:
  - Precision: 0.94
  - Recall: 0.8785046728971962
  - F1: 0.9082125603864735
  - Number: 107
- Bs:
  - Precision: 0.9848484848484849
  - Recall: 0.9701492537313433
  - F1: 0.9774436090225564
  - Number: 67
- Bz:
  - Precision: 0.9865819209039548
  - Recall: 0.9850167459897762
  - F1: 0.9857987121813531
  - Number: 5673
- C:
  - Precision: 0.9993461203138623
  - Recall: 0.9993461203138623
  - F1: 0.9993461203138623
  - Number: 4588
- D:
  - Precision: 0.9876836325864372
  - Recall: 0.9895926256318763
  - F1: 0.988637207575195
  - Number: 6726
- Dt:
  - Precision: 1.0
  - Recall: 0.8
  - F1: 0.888888888888889
  - Number: 15
- H:
  - Precision: 0.9487382595903587
  - Recall: 0.9305216426193119
  - F1: 0.9395416596626883
  - Number: 9010
- J:
  - Precision: 0.9803528468323978
  - Recall: 0.980588754311382
  - F1: 0.9804707863816818
  - Number: 12467
- Jr:
  - Precision: 0.9400386847195358
  - Recall: 0.9818181818181818
  - F1: 0.9604743083003953
  - Number: 495
- Js:
  - Precision: 0.9612141652613828
  - Recall: 0.991304347826087
  - F1: 0.9760273972602741
  - Number: 575
- N:
  - Precision: 0.9795543362923471
  - Recall: 0.9793769083475651
  - F1: 0.9794656142847902
  - Number: 38646
- Np:
  - Precision: 0.9330242966751918
  - Recall: 0.9278334128119536
  - F1: 0.9304216147286205
  - Number: 6291
- Nps:
  - Precision: 0.75
  - Recall: 0.23076923076923078
  - F1: 0.3529411764705882
  - Number: 26
- Ns:
  - Precision: 0.9691858990616282
  - Recall: 0.9773657289002557
  - F1: 0.9732586272762003
  - Number: 7820
- O:
  - Precision: 0.9984323288625675
  - Recall: 0.999302649930265
  - F1: 0.9988672998170254
  - Number: 5736
- Os:
  - Precision: 1.0
  - Recall: 0.9952267303102625
  - F1: 0.9976076555023923
  - Number: 419
- P:
  - Precision: 0.9887869520897044
  - Recall: 0.9918200408997955
  - F1: 0.9903011740684022
  - Number: 2934
- Rb:
  - Precision: 0.9971910112359551
  - Recall: 0.9983929288871033
  - F1: 0.9977916081108211
  - Number: 2489
- Rl:
  - Precision: 1.0
  - Recall: 0.9997228381374723
  - F1: 0.9998613998613999
  - Number: 3608
- Rp:
  - Precision: 0.9979960600502683
  - Recall: 0.9980638586956522
  - F1: 0.9980299582215278
  - Number: 29440
- Rp$:
  - Precision: 0.9975770162686051
  - Recall: 0.9972318339100346
  - F1: 0.9974043952240872
  - Number: 5780
- Sr:
  - Precision: 0.9998923110058152
  - Recall: 0.9998384752059442
  - F1: 0.9998653923812088
  - Number: 18573
- T:
  - Precision: 0.9987569919204475
  - Recall: 0.9984811874352779
  - F1: 0.9986190706345371
  - Number: 28970
- W:
  - Precision: 0.0
  - Recall: 0.0
  - F1: 0.0
  - Number: 1
- X:
  - Precision: 0.9466666666666667
  - Recall: 0.9594594594594594
  - F1: 0.9530201342281879
  - Number: 74
- Ym:
  - Precision: 0.0
  - Recall: 0.0
  - F1: 0.0
  - Number: 5
- ' ':
  - Precision: 0.9951481772882245
  - Recall: 0.9949524745984923
  - F1: 0.9950503163208444
  - Number: 15255
- '`':
  - Precision: 0.9540229885057471
  - Recall: 0.9595375722543352
  - F1: 0.956772334293948
  - Number: 173
- Overall:
  - Precision: 0.9828
  - Recall: 0.9820
  - F1: 0.9824
  - Accuracy: 0.9860
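
A minimal inference sketch is below. The Hub model id is a placeholder (the card does not state the repository name), so substitute the actual one; `aggregation_strategy="simple"` is an assumption for merging word-piece predictions into word-level tags.

```python
from transformers import pipeline

# Placeholder model id -- replace with the actual repository name on the Hub.
tagger = pipeline(
    "token-classification",
    model="your-username/bert-base-cased-finetuned-twitter-pos-vcb",
    aggregation_strategy="simple",  # merge word-piece predictions per word
)

print(tagger("just landed and grabbing coffee before the game"))
```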

## Model description

### Training results

| Training Loss | Epoch | Step | Validation Loss | ''' Precision | ''' Recall | ''' F1 | ''' Number | B Precision | B Recall | B F1 | B Number | Bd Precision | Bd Recall | Bd F1 | Bd Number | Bg Precision | Bg Recall | Bg F1 | Bg Number | Bn Precision | Bn Recall | Bn F1 | Bn Number | Bp Precision | Bp Recall | Bp F1 | Bp Number | Br Precision | Br Recall | Br F1 | Br Number | Bs Precision | Bs Recall | Bs F1 | Bs Number | Bz Precision | Bz Recall | Bz F1 | Bz Number | C Precision | C Recall | C F1 | C Number | D Precision | D Recall | D F1 | D Number | Dt Precision | Dt Recall | Dt F1 | Dt Number | H Precision | H Recall | H F1 | H Number | J Precision | J Recall | J F1 | J Number | Jr Precision | Jr Recall | Jr F1 | Jr Number | Js Precision | Js Recall | Js F1 | Js Number | N Precision | N Recall | N F1 | N Number | Np Precision | Np Recall | Np F1 | Np Number | Nps Precision | Nps Recall | Nps F1 | Nps Number | Ns Precision | Ns Recall | Ns F1 | Ns Number | O Precision | O Recall | O F1 | O Number | Os Precision | Os Recall | Os F1 | Os Number | P Precision | P Recall | P F1 | P Number | Rb Precision | Rb Recall | Rb F1 | Rb Number | Rl Precision | Rl Recall | Rl F1 | Rl Number | Rp Precision | Rp Recall | Rp F1 | Rp Number | Rp$ Precision | Rp$ Recall | Rp$ F1 | Rp$ Number | Sr Precision | Sr Recall | Sr F1 | Sr Number | T Precision | T Recall | T F1 | T Number | W Precision | W Recall | W F1 | W Number | X Precision | X Recall | X F1 | X Number | Ym Precision | Ym Recall | Ym F1 | Ym Number | ' ' Precision | ' ' Recall | ' ' F1 | ' ' Number | '`' Precision | '`' Recall | '`' F1 | '`' Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|
| 0.0617 | 1.0 | 7477 | 0.0595 | 0.9331 | 0.9391 | 0.9361 | 312 | 0.9563 | 0.9536 | 0.9550 | 25496 | 0.9716 | 0.9322 | 0.9515 | 5548 | 0.9811 | 0.9786 | 0.9798 | 5663 | 0.8725 | 0.9231 | 0.8971 | 2106 | 0.9556 | 0.9586 | 0.9571 | 15839 | 0.8879 | 0.8879 | 0.8879 | 107 | 0.8590 | 1.0 | 0.9241 | 67 | 0.9793 | 0.9834 | 0.9814 | 5673 | 0.9985 | 0.9991 | 0.9988 | 4588 | 0.9818 | 0.9886 | 0.9852 | 6726 | 1.0 | 0.8 | 0.8889 | 15 | 0.9391 | 0.9105 | 0.9246 | 9010 | 0.9707 | 0.9766 | 0.9736 | 12467 | 0.9212 | 0.9677 | 0.9438 | 495 | 0.9227 | 0.9757 | 0.9484 | 575 | 0.9754 | 0.9738 | 0.9746 | 38646 | 0.9158 | 0.9200 | 0.9179 | 6291 | 0.0 | 0.0 | 0.0 | 26 | 0.9657 | 0.9688 | 0.9673 | 7820 | 0.9972 | 0.9990 | 0.9981 | 5736 | 1.0 | 0.9928 | 0.9964 | 419 | 0.9771 | 0.9908 | 0.9839 | 2934 | 0.9948 | 0.9968 | 0.9958 | 2489 | 1.0 | 0.9997 | 0.9999 | 3608 | 0.9970 | 0.9976 | 0.9973 | 29440 | 0.9974 | 0.9954 | 0.9964 | 5780 | 0.9998 | 0.9998 | 0.9998 | 18573 | 0.9977 | 0.9982 | 0.9979 | 28970 | 0.0 | 0.0 | 0.0 | 1 | 0.8861 | 0.9459 | 0.9150 | 74 | 0.0 | 0.0 | 0.0 | 5 | 0.9936 | 0.9926 | 0.9931 | 15255 | 0.9540 | 0.9595 | 0.9568 | 173 | 0.9779 | 0.9772 | 0.9775 | 0.9821 |
| 0.0407 | 2.0 | 14954 | 0.0531 | 0.9605 | 0.9359 | 0.9481 | 312 | 0.9599 | 0.9646 | 0.9622 | 25496 | 0.9674 | 0.9459 | 0.9565 | 5548 | 0.9834 | 0.9825 | 0.9830 | 5663 | 0.8920 | 0.9259 | 0.9087 | 2106 | 0.9728 | 0.9569 | 0.9648 | 15839 | 0.9592 | 0.8785 | 0.9171 | 107 | 0.9429 | 0.9851 | 0.9635 | 67 | 0.9890 | 0.9825 | 0.9858 | 5673 | 0.9991 | 0.9993 | 0.9992 | 4588 | 0.9855 | 0.9896 | 0.9875 | 6726 | 1.0 | 0.8 | 0.8889 | 15 | 0.9498 | 0.9303 | 0.9399 | 9010 | 0.9776 | 0.9797 | 0.9786 | 12467 | 0.9125 | 0.9899 | 0.9496 | 495 | 0.9481 | 0.9843 | 0.9659 | 575 | 0.9788 | 0.9771 | 0.9779 | 38646 | 0.9252 | 0.9285 | 0.9268 | 6291 | 0.5 | 0.2308 | 0.3158 | 26 | 0.9653 | 0.9769 | 0.9711 | 7820 | 0.9976 | 0.9993 | 0.9984 | 5736 | 0.9929 | 0.9952 | 0.9940 | 419 | 0.9861 | 0.9928 | 0.9895 | 2934 | 0.9972 | 0.9984 | 0.9978 | 2489 | 1.0 | 0.9997 | 0.9999 | 3608 | 0.9986 | 0.9982 | 0.9984 | 29440 | 0.9964 | 0.9978 | 0.9971 | 5780 | 0.9999 | 0.9999 | 0.9999 | 18573 | 0.9985 | 0.9983 | 0.9984 | 28970 | 0.0 | 0.0 | 0.0 | 1 | 0.9114 | 0.9730 | 0.9412 | 74 | 0.0 | 0.0 | 0.0 | 5 | 0.9949 | 0.9961 | 0.9955 | 15255 | 0.9651 | 0.9595 | 0.9623 | 173 | 0.9817 | 0.9808 | 0.9813 | 0.9850 |
| 0.0246 | 3.0 | 22431 | 0.0533 | 0.9581 | 0.9519 | 0.9550 | 312 | 0.9658 | 0.9655 | 0.9657 | 25496 | 0.9630 | 0.9573 | 0.9601 | 5548 | 0.9836 | 0.9853 | 0.9845 | 5663 | 0.9182 | 0.9117 | 0.9149 | 2106 | 0.9672 | 0.9663 | 0.9668 | 15839 | 0.94 | 0.8785 | 0.9082 | 107 | 0.9848 | 0.9701 | 0.9774 | 67 | 0.9866 | 0.9850 | 0.9858 | 5673 | 0.9993 | 0.9993 | 0.9993 | 4588 | 0.9877 | 0.9896 | 0.9886 | 6726 | 1.0 | 0.8 | 0.8889 | 15 | 0.9487 | 0.9305 | 0.9395 | 9010 | 0.9804 | 0.9806 | 0.9805 | 12467 | 0.9400 | 0.9818 | 0.9605 | 495 | 0.9612 | 0.9913 | 0.9760 | 575 | 0.9796 | 0.9794 | 0.9795 | 38646 | 0.9330 | 0.9278 | 0.9304 | 6291 | 0.75 | 0.2308 | 0.3529 | 26 | 0.9692 | 0.9774 | 0.9733 | 7820 | 0.9984 | 0.9993 | 0.9989 | 5736 | 1.0 | 0.9952 | 0.9976 | 419 | 0.9888 | 0.9918 | 0.9903 | 2934 | 0.9972 | 0.9984 | 0.9978 | 2489 | 1.0 | 0.9997 | 0.9999 | 3608 | 0.9980 | 0.9981 | 0.9981 | 29440 | 0.9976 | 0.9972 | 0.9974 | 5780 | 0.9999 | 0.9998 | 0.9999 | 18573 | 0.9988 | 0.9985 | 0.9986 | 28970 | 0.0 | 0.0 | 0.0 | 1 | 0.9467 | 0.9595 | 0.9530 | 74 | 0.0 | 0.0 | 0.0 | 5 | 0.9951 | 0.9950 | 0.9951 | 15255 | 0.9540 | 0.9595 | 0.9568 | 173 | 0.9828 | 0.9820 | 0.9824 | 0.9860 |
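
The per-tag columns above, like the list in the summary, mirror seqeval's report: precision, recall, F1, and support ("Number") for each tag, plus overall scores. Below is a minimal sketch of that computation using the `evaluate` wrapper, with toy label sequences rather than real twitter_pos_vcb data:

```python
import evaluate  # pip install evaluate seqeval

# Toy label sequences standing in for real twitter_pos_vcb annotations.
references = [["N", "B", "Rp", "N", "T"]]
predictions = [["N", "B", "Rp", "Np", "T"]]

seqeval = evaluate.load("seqeval")
results = seqeval.compute(predictions=predictions, references=references)

# One dict per tag with precision/recall/f1/number, plus
# overall_precision, overall_recall, overall_f1, and overall_accuracy.
for name, scores in results.items():
    print(name, scores)
```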

### Framework versions