Update Model #3
by stefan-it · opened
- README.md +9 -10
- loss.tsv +151 -151
- pytorch_model.bin +2 -2
- training.log +0 -0
README.md
CHANGED
@@ -3,10 +3,10 @@ tags:
 - flair
 - token-classification
 - sequence-tagger-model
-language:
-- en
-- de
-- fr
+language:
+- en
+- de
+- fr
 - it
 - nl
 - pl
@@ -26,7 +26,7 @@ widget:
 
 This is the default multilingual universal part-of-speech tagging model that ships with [Flair](https://github.com/flairNLP/flair/).
 
-F1-Score: **
+F1-Score: **96.87** (12 UD Treebanks covering English, German, French, Italian, Dutch, Polish, Spanish, Swedish, Danish, Norwegian, Finnish and Czech)
 
 Predicts universal POS tags:
 
@@ -94,14 +94,14 @@ Token[6]: "say" → VERB (0.9998)
 Token[7]: "." → PUNCT (1.0)
 ```
 
-So, the words "*Ich*" and "*they*" are labeled as **pronouns** (PRON), while "*liebe*" and "*say*" are labeled as **verbs** (VERB) in the multilingual sentence "*Ich liebe Berlin, as they say*".
+So, the words "*Ich*" and "*they*" are labeled as **pronouns** (PRON), while "*liebe*" and "*say*" are labeled as **verbs** (VERB) in the multilingual sentence "*Ich liebe Berlin, as they say*".
 
 
 ---
 
 ### Training: Script to train this model
 
-The following Flair script was used to train this model:
+The following Flair script was used to train this model:
 
 ```python
 from flair.data import MultiCorpus
@@ -129,11 +129,10 @@ corpus = MultiCorpus([
 tag_type = 'upos'
 
 # 3. make the tag dictionary from the corpus
-tag_dictionary = corpus.
+tag_dictionary = corpus.make_label_dictionary(label_type=tag_type)
 
 # 4. initialize each embedding we use
 embedding_types = [
-
 # contextual string embeddings, forward
 FlairEmbeddings('multi-forward'),
 
@@ -141,7 +140,7 @@ embedding_types = [
 FlairEmbeddings('multi-backward'),
 ]
 
-# embedding stack consists of Flair
+# embedding stack consists of Flair embeddings
 embeddings = StackedEmbeddings(embeddings=embedding_types)
 
 # 5. initialize sequence tagger
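For context on the script change above: `make_label_dictionary(label_type=...)` is the current Flair corpus API, superseding the older, since-deprecated `make_tag_dictionary(tag_type=...)` call (presumably what the truncated old line contained). A minimal sketch of the updated step under recent Flair versions, using just two of the twelve UD treebanks for brevity:

```python
from flair.data import MultiCorpus
from flair.datasets import UD_ENGLISH, UD_GERMAN  # two of the 12 treebanks, for brevity

# assemble a multilingual corpus and pick the label type
corpus = MultiCorpus([UD_ENGLISH(), UD_GERMAN()])
tag_type = 'upos'

# new-style API: make_label_dictionary(label_type=...) replaces
# the deprecated corpus.make_tag_dictionary(tag_type=...)
tag_dictionary = corpus.make_label_dictionary(label_type=tag_type)
print(tag_dictionary)
```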
loss.tsv
CHANGED
@@ -1,151 +1,151 @@
-EPOCH TIMESTAMP … (151 removed lines; only this truncated header renders in the diff view)
+EPOCH TIMESTAMP LEARNING_RATE TRAIN_LOSS
+1 00:11:58 0.1000 0.4237
+2 00:36:04 0.1000 0.3138
+3 01:00:09 0.1000 0.2869
+4 01:24:15 0.1000 0.2718
+5 01:48:19 0.1000 0.2614
+6 02:12:23 0.1000 0.2538
+7 02:36:31 0.1000 0.2477
+8 03:00:37 0.1000 0.2433
+9 03:24:44 0.1000 0.2389
+10 03:48:50 0.1000 0.2355
+11 04:12:56 0.1000 0.2322
+12 04:37:02 0.1000 0.2295
+13 05:01:09 0.1000 0.2276
+14 05:25:13 0.1000 0.2256
+15 05:49:17 0.1000 0.2236
+16 06:13:25 0.1000 0.2215
+17 06:37:30 0.1000 0.2203
+18 07:01:38 0.1000 0.2190
+19 07:25:43 0.1000 0.2176
+20 07:49:49 0.1000 0.2165
+21 08:13:57 0.1000 0.2152
+22 08:38:03 0.1000 0.2142
+23 09:02:11 0.1000 0.2128
+24 09:26:16 0.1000 0.2114
+25 09:50:23 0.1000 0.2116
+26 10:14:28 0.1000 0.2099
+27 10:38:33 0.1000 0.2099
+28 11:02:37 0.1000 0.2089
+29 11:26:44 0.1000 0.2082
+30 11:50:52 0.1000 0.2078
+31 12:14:56 0.1000 0.2071
+32 12:39:00 0.1000 0.2064
+33 13:03:07 0.1000 0.2053
+34 13:27:14 0.1000 0.2052
+35 13:51:21 0.1000 0.2043
+36 14:15:27 0.1000 0.2042
+37 14:39:31 0.1000 0.2037
+38 15:03:36 0.1000 0.2032
+39 15:27:41 0.1000 0.2025
+40 15:51:47 0.1000 0.2023
+41 16:15:52 0.1000 0.2017
+42 16:39:56 0.1000 0.2015
+43 17:04:01 0.1000 0.2011
+44 17:28:07 0.1000 0.2007
+45 17:52:13 0.1000 0.2002
+46 18:16:20 0.1000 0.1999
+47 18:40:25 0.1000 0.1996
+48 19:04:30 0.1000 0.1996
+49 19:28:33 0.1000 0.1986
+50 19:52:39 0.1000 0.1994
+51 20:16:47 0.1000 0.1985
+52 20:40:52 0.1000 0.1980
+53 21:05:00 0.1000 0.1977
+54 21:29:04 0.1000 0.1980
+55 21:53:10 0.1000 0.1971
+56 22:17:14 0.1000 0.1971
+57 22:41:19 0.1000 0.1971
+58 23:05:25 0.1000 0.1967
+59 23:29:26 0.1000 0.1963
+60 23:53:31 0.1000 0.1962
+61 00:17:34 0.1000 0.1958
+62 00:41:38 0.1000 0.1956
+63 01:05:45 0.1000 0.1958
+64 01:29:52 0.1000 0.1951
+65 01:54:01 0.1000 0.1949
+66 02:18:05 0.1000 0.1948
+67 02:42:11 0.1000 0.1946
+68 03:06:20 0.1000 0.1947
+69 03:30:26 0.1000 0.1947
+70 03:54:29 0.1000 0.1944
+71 04:18:35 0.1000 0.1941
+72 04:42:43 0.1000 0.1940
+73 05:06:48 0.1000 0.1935
+74 05:30:53 0.1000 0.1938
+75 05:54:57 0.1000 0.1939
+76 06:19:06 0.1000 0.1930
+77 06:43:09 0.1000 0.1930
+78 07:07:16 0.1000 0.1931
+79 07:31:22 0.1000 0.1930
+80 07:55:27 0.1000 0.1926
+81 08:19:29 0.1000 0.1929
+82 08:43:34 0.1000 0.1921
+83 09:07:40 0.1000 0.1925
+84 09:31:42 0.1000 0.1928
+85 09:55:48 0.1000 0.1925
+86 10:19:54 0.1000 0.1920
+87 10:43:58 0.1000 0.1922
+88 11:08:01 0.1000 0.1920
+89 11:32:09 0.1000 0.1916
+90 11:56:13 0.1000 0.1915
+91 12:20:17 0.1000 0.1916
+92 12:44:24 0.1000 0.1916
+93 13:08:31 0.1000 0.1917
+94 13:32:34 0.1000 0.1918
+95 13:56:40 0.0500 0.1802
+96 14:20:45 0.0500 0.1768
+97 14:44:51 0.0500 0.1755
+98 15:08:58 0.0500 0.1735
+99 15:33:05 0.0500 0.1730
+100 15:57:10 0.0500 0.1726
+101 16:21:15 0.0500 0.1714
+102 16:45:23 0.0500 0.1711
+103 17:09:31 0.0500 0.1704
+104 17:33:39 0.0500 0.1702
+105 17:57:44 0.0500 0.1703
+106 18:21:47 0.0500 0.1691
+107 18:45:51 0.0500 0.1694
+108 19:09:55 0.0500 0.1687
+109 19:34:01 0.0500 0.1686
+110 19:58:07 0.0500 0.1682
+111 20:22:12 0.0500 0.1681
+112 20:46:19 0.0500 0.1673
+113 21:10:26 0.0500 0.1671
+114 21:34:32 0.0500 0.1668
+115 21:58:40 0.0500 0.1665
+116 22:22:46 0.0500 0.1667
+117 22:46:55 0.0500 0.1662
+118 23:11:00 0.0500 0.1658
+119 23:35:04 0.0500 0.1660
+120 23:59:12 0.0500 0.1657
+121 00:23:14 0.0500 0.1659
+122 00:47:20 0.0500 0.1654
+123 01:11:23 0.0500 0.1652
+124 01:35:31 0.0500 0.1650
+125 01:59:40 0.0500 0.1647
+126 02:23:48 0.0500 0.1646
+127 02:47:53 0.0500 0.1641
+128 03:11:58 0.0500 0.1641
+129 03:36:06 0.0500 0.1642
+130 04:00:12 0.0500 0.1637
+131 04:24:17 0.0500 0.1637
+132 04:48:23 0.0500 0.1632
+133 05:12:29 0.0500 0.1633
+134 05:36:36 0.0500 0.1629
+135 06:00:40 0.0500 0.1629
+136 06:24:46 0.0500 0.1626
+137 06:48:53 0.0500 0.1626
+138 07:12:58 0.0500 0.1625
+139 07:37:01 0.0500 0.1617
+140 08:01:07 0.0500 0.1620
+141 08:25:13 0.0500 0.1619
+142 08:49:18 0.0500 0.1617
+143 09:13:22 0.0500 0.1616
+144 09:37:28 0.0500 0.1613
+145 10:01:34 0.0500 0.1606
+146 10:25:40 0.0500 0.1612
+147 10:49:46 0.0500 0.1611
+148 11:13:52 0.0500 0.1608
+149 11:37:56 0.0500 0.1607
+150 12:02:00 0.0250 0.1558
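The new loss.tsv makes the annealing schedule easy to read off: the learning rate sits at 0.1 through epoch 94, is halved to 0.05 at epoch 95 and again to 0.025 at epoch 150, while training loss falls from 0.4237 to 0.1558. A small sketch that reproduces this reading, assuming the file is tab-separated as Flair writes it:

```python
import csv

# read the four columns shown in this diff and report each learning-rate drop
with open('loss.tsv') as f:
    rows = list(csv.DictReader(f, delimiter='\t'))

prev_lr = None
for row in rows:
    lr = float(row['LEARNING_RATE'])
    if prev_lr is not None and lr < prev_lr:
        print(f"epoch {row['EPOCH']}: LR halved {prev_lr} -> {lr}, "
              f"train loss now {row['TRAIN_LOSS']}")
    prev_lr = lr
```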
pytorch_model.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:a01ffffa385dabfebe791703cf37f82b5b4811de243a46967d907544112bb28e
+size 253762544
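Only the three-line Git LFS pointer changes here; the actual weights (253762544 bytes, roughly 254 MB) live in LFS storage under the new sha256 oid. A quick sketch to check a downloaded pytorch_model.bin against the pointer in this diff:

```python
import hashlib

# stream the file and compare its SHA-256 with the LFS pointer's oid
h = hashlib.sha256()
with open('pytorch_model.bin', 'rb') as f:
    for chunk in iter(lambda: f.read(1 << 20), b''):
        h.update(chunk)

expected = 'a01ffffa385dabfebe791703cf37f82b5b4811de243a46967d907544112bb28e'
assert h.hexdigest() == expected, 'checksum mismatch: incomplete or wrong download'
```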
training.log
CHANGED
The diff for this file is too large to render; see the raw diff.