initial model commit
- README.md +166 -0
- loss.tsv +151 -0
- pytorch_model.bin +3 -0
- training.log +0 -0
README.md
ADDED
@@ -0,0 +1,166 @@
---
tags:
- flair
- token-classification
- sequence-tagger-model
language:
- en
- de
- fr
- it
- nl
- pl
- es
- sv
- da
- no
- fi
- cs
datasets:
- ontonotes
inference: false
---

## Multilingual Universal Part-of-Speech Tagging in Flair (default model)

This is the default multilingual universal part-of-speech tagging model that ships with [Flair](https://github.com/flairNLP/flair/).

F1-Score: **98.47** (12 UD treebanks covering English, German, French, Italian, Dutch, Polish, Spanish, Swedish, Danish, Norwegian, Finnish and Czech)

Predicts universal POS tags:

| **tag** | **meaning** |
|---------|---------------------------|
| ADJ     | adjective                 |
| ADP     | adposition                |
| ADV     | adverb                    |
| AUX     | auxiliary                 |
| CCONJ   | coordinating conjunction  |
| DET     | determiner                |
| INTJ    | interjection              |
| NOUN    | noun                      |
| NUM     | numeral                   |
| PART    | particle                  |
| PRON    | pronoun                   |
| PROPN   | proper noun               |
| PUNCT   | punctuation               |
| SCONJ   | subordinating conjunction |
| SYM     | symbol                    |
| VERB    | verb                      |
| X       | other                     |

Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.

---

### Demo: How to use in Flair

Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load tagger
tagger = SequenceTagger.load("flair/upos-multi")

# make example sentence
sentence = Sentence("Ich liebe Berlin, as they say.")

# predict POS tags
tagger.predict(sentence)

# print sentence
print(sentence)

# print predicted POS spans
print('The following POS tags are found:')
# iterate over tagged spans and print
for entity in sentence.get_spans('pos'):
    print(entity)
```

This yields the following output:
```
Span [1]: "Ich" [− Labels: PRON (0.9999)]
Span [2]: "liebe" [− Labels: VERB (0.9999)]
Span [3]: "Berlin" [− Labels: PROPN (0.9997)]
Span [4]: "," [− Labels: PUNCT (1.0)]
Span [5]: "as" [− Labels: SCONJ (0.9991)]
Span [6]: "they" [− Labels: PRON (0.9998)]
Span [7]: "say" [− Labels: VERB (0.9998)]
Span [8]: "." [− Labels: PUNCT (1.0)]
```

So, the words "*Ich*" and "*they*" are labeled as **pronouns** (PRON), while "*liebe*" and "*say*" are labeled as **verbs** (VERB) in the multilingual sentence "*Ich liebe Berlin, as they say*".
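
If you need the predictions programmatically rather than printed, each span returned by `get_spans('pos')` carries its label value and confidence score. Below is a minimal sketch (reusing the tagger and sentence from the demo above; attribute names follow the Flair version that produced the output shown) that collects only nouns and proper nouns:

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load the multilingual UPOS tagger, as in the demo above
tagger = SequenceTagger.load("flair/upos-multi")

sentence = Sentence("Ich liebe Berlin, as they say.")
tagger.predict(sentence)

# keep only spans whose predicted tag is NOUN or PROPN
nouns = [(span.text, span.labels[0].value, span.labels[0].score)
         for span in sentence.get_spans('pos')
         if span.labels[0].value in ("NOUN", "PROPN")]

print(nouns)  # expected to contain ('Berlin', 'PROPN', ...) for this sentence
```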

---

### Training: Script to train this model

The following Flair script was used to train this model:

```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings

# 1. load the corpus (Ontonotes does not ship with Flair, you need to download it and reformat it into column format yourself)
corpus: Corpus = ColumnCorpus(
    "resources/tasks/onto-ner",
    column_format={0: "text", 1: "pos", 2: "upos", 3: "ner"},
    tag_to_bioes="ner",
)

# 2. what tag do we want to predict?
tag_type = 'upos'

# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

# 4. initialize each embedding we use
embedding_types = [

    # contextual string embeddings, forward
    FlairEmbeddings('multi-forward'),

    # contextual string embeddings, backward
    FlairEmbeddings('multi-backward'),
]

# embedding stack consists of the forward and backward Flair embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)

# 5. initialize sequence tagger
from flair.models import SequenceTagger

tagger = SequenceTagger(hidden_size=256,
                        embeddings=embeddings,
                        tag_dictionary=tag_dictionary,
                        tag_type=tag_type)

# 6. initialize trainer
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)

# 7. run training
trainer.train('resources/taggers/upos-english-fast',
              train_with_dev=True,
              max_epochs=150)
```
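
After training completes, the trainer writes its checkpoint into the output directory given above. As a minimal sketch for reusing the result (the output path and the `final-model.pt` file name are assumed from the script above):

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load the checkpoint written at the end of training (path assumed from the training script)
tagger = SequenceTagger.load('resources/taggers/upos-english-fast/final-model.pt')

# tag a new sentence with the freshly trained model
sentence = Sentence("George Washington ging nach Washington.")
tagger.predict(sentence)
print(sentence.to_tagged_string())
```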

---

### Cite

Please cite the following paper when using this model.

```
@inproceedings{akbik2018coling,
  title     = {Contextual String Embeddings for Sequence Labeling},
  author    = {Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
  booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
  pages     = {1638--1649},
  year      = {2018}
}
```

---

### Issues?

The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
loss.tsv
ADDED
@@ -0,0 +1,151 @@
EPOCH TIMESTAMP BAD_EPOCHS LEARNING_RATE TRAIN_LOSS TRAIN_PRECISION TRAIN_RECALL TRAIN_ACCURACY TRAIN_F-SCORE DEV_LOSS DEV_PRECISION DEV_RECALL DEV_ACCURACY DEV_F-SCORE TEST_LOSS TEST_PRECISION TEST_RECALL TEST_ACCURACY TEST_F-SCORE
0 11:12:55 0 0.1000 0.7210021345162283 _ _ _ _ _ _ _ _ _ _ 0.8481 0.8481 0.8481 0.8481
1 11:36:23 0 0.1000 0.4978160696237081 _ _ _ _ _ _ _ _ _ _ 0.878 0.878 0.878 0.878
2 11:59:52 0 0.1000 0.43429711483610006 _ _ _ _ _ _ _ _ _ _ 0.8967 0.8967 0.8967 0.8967
3 12:23:17 0 0.1000 0.39905918871464724 _ _ _ _ _ _ _ _ _ _ 0.904 0.904 0.904 0.904
4 12:46:45 0 0.1000 0.3742375968457236 _ _ _ _ _ _ _ _ _ _ 0.9095 0.9095 0.9095 0.9095
5 13:10:21 0 0.1000 0.35342702350133975 _ _ _ _ _ _ _ _ _ _ 0.9169 0.9169 0.9169 0.9169
6 13:33:49 0 0.1000 0.33980307603623794 _ _ _ _ _ _ _ _ _ _ 0.9202 0.9202 0.9202 0.9202
7 13:57:21 0 0.1000 0.32579335047085706 _ _ _ _ _ _ _ _ _ _ 0.9232 0.9232 0.9232 0.9232
8 14:20:44 0 0.1000 0.31598528568207496 _ _ _ _ _ _ _ _ _ _ 0.9276 0.9276 0.9276 0.9276
9 14:44:11 0 0.1000 0.30771679594818646 _ _ _ _ _ _ _ _ _ _ 0.9303 0.9303 0.9303 0.9303
10 15:07:37 0 0.1000 0.29935787006847003 _ _ _ _ _ _ _ _ _ _ 0.933 0.933 0.933 0.933
11 15:31:00 0 0.1000 0.2921474775313901 _ _ _ _ _ _ _ _ _ _ 0.9343 0.9343 0.9343 0.9343
12 15:54:34 0 0.1000 0.2871364236691731 _ _ _ _ _ _ _ _ _ _ 0.9354 0.9354 0.9354 0.9354
13 16:17:59 0 0.1000 0.28096626128222385 _ _ _ _ _ _ _ _ _ _ 0.9367 0.9367 0.9367 0.9367
14 16:41:22 0 0.1000 0.2746626851626277 _ _ _ _ _ _ _ _ _ _ 0.9388 0.9388 0.9388 0.9388
15 17:04:53 0 0.1000 0.2702156779567315 _ _ _ _ _ _ _ _ _ _ 0.939 0.939 0.939 0.939
16 17:28:14 0 0.1000 0.2677499098394409 _ _ _ _ _ _ _ _ _ _ 0.94 0.94 0.94 0.94
17 17:51:41 0 0.1000 0.26327742492058104 _ _ _ _ _ _ _ _ _ _ 0.9414 0.9414 0.9414 0.9414
18 18:15:15 0 0.1000 0.25877575904336814 _ _ _ _ _ _ _ _ _ _ 0.9432 0.9432 0.9432 0.9432
19 18:38:40 0 0.1000 0.2537475148621729 _ _ _ _ _ _ _ _ _ _ 0.9429 0.9429 0.9429 0.9429
20 19:02:06 0 0.1000 0.2526139328870411 _ _ _ _ _ _ _ _ _ _ 0.9433 0.9433 0.9433 0.9433
21 19:25:36 0 0.1000 0.25058489056542393 _ _ _ _ _ _ _ _ _ _ 0.9454 0.9454 0.9454 0.9454
22 19:49:01 0 0.1000 0.24793850796675693 _ _ _ _ _ _ _ _ _ _ 0.9459 0.9459 0.9459 0.9459
23 20:12:32 0 0.1000 0.24369133532522563 _ _ _ _ _ _ _ _ _ _ 0.9458 0.9458 0.9458 0.9458
24 20:35:57 0 0.1000 0.24157939653927565 _ _ _ _ _ _ _ _ _ _ 0.9464 0.9464 0.9464 0.9464
25 20:59:37 0 0.1000 0.23970980893528734 _ _ _ _ _ _ _ _ _ _ 0.9477 0.9477 0.9477 0.9477
26 21:23:02 0 0.1000 0.23712054908323255 _ _ _ _ _ _ _ _ _ _ 0.9474 0.9474 0.9474 0.9474
27 21:46:26 0 0.1000 0.2367942676861768 _ _ _ _ _ _ _ _ _ _ 0.9464 0.9464 0.9464 0.9464
28 22:09:57 0 0.1000 0.2333034627188289 _ _ _ _ _ _ _ _ _ _ 0.9481 0.9481 0.9481 0.9481
29 22:33:23 0 0.1000 0.23126234505920054 _ _ _ _ _ _ _ _ _ _ 0.949 0.949 0.949 0.949
30 22:56:56 0 0.1000 0.22932123181550282 _ _ _ _ _ _ _ _ _ _ 0.9484 0.9484 0.9484 0.9484
31 23:20:45 0 0.1000 0.2273743192486796 _ _ _ _ _ _ _ _ _ _ 0.9492 0.9492 0.9492 0.9492
32 23:44:10 0 0.1000 0.2255119944040062 _ _ _ _ _ _ _ _ _ _ 0.9485 0.9485 0.9485 0.9485
33 00:07:34 0 0.1000 0.22409818432363 _ _ _ _ _ _ _ _ _ _ 0.95 0.95 0.95 0.95
34 00:30:57 0 0.1000 0.22151199458546297 _ _ _ _ _ _ _ _ _ _ 0.9506 0.9506 0.9506 0.9506
35 00:54:24 0 0.1000 0.22155024732969722 _ _ _ _ _ _ _ _ _ _ 0.9512 0.9512 0.9512 0.9512
36 01:17:51 1 0.1000 0.2205742845282017 _ _ _ _ _ _ _ _ _ _ 0.9511 0.9511 0.9511 0.9511
37 01:41:20 0 0.1000 0.21852846494206418 _ _ _ _ _ _ _ _ _ _ 0.9511 0.9511 0.9511 0.9511
38 02:04:56 0 0.1000 0.21719484818274226 _ _ _ _ _ _ _ _ _ _ 0.9519 0.9519 0.9519 0.9519
39 02:28:19 0 0.1000 0.21682299653427595 _ _ _ _ _ _ _ _ _ _ 0.9509 0.9509 0.9509 0.9509
40 02:51:41 0 0.1000 0.21704343444362736 _ _ _ _ _ _ _ _ _ _ 0.9524 0.9524 0.9524 0.9524
41 03:15:08 1 0.1000 0.21260865162828568 _ _ _ _ _ _ _ _ _ _ 0.9526 0.9526 0.9526 0.9526
42 03:38:43 0 0.1000 0.21280285771466917 _ _ _ _ _ _ _ _ _ _ 0.9525 0.9525 0.9525 0.9525
43 04:02:11 1 0.1000 0.21139400303602057 _ _ _ _ _ _ _ _ _ _ 0.9532 0.9532 0.9532 0.9532
44 04:25:50 0 0.1000 0.2103025148200453 _ _ _ _ _ _ _ _ _ _ 0.9534 0.9534 0.9534 0.9534
45 04:49:17 0 0.1000 0.21047394277326975 _ _ _ _ _ _ _ _ _ _ 0.954 0.954 0.954 0.954
46 05:12:43 1 0.1000 0.20889954569228175 _ _ _ _ _ _ _ _ _ _ 0.9538 0.9538 0.9538 0.9538
47 05:36:11 0 0.1000 0.2081116257508449 _ _ _ _ _ _ _ _ _ _ 0.9534 0.9534 0.9534 0.9534
48 05:59:47 0 0.1000 0.20735290020180713 _ _ _ _ _ _ _ _ _ _ 0.9544 0.9544 0.9544 0.9544
49 06:23:19 0 0.1000 0.2051314154268594 _ _ _ _ _ _ _ _ _ _ 0.9545 0.9545 0.9545 0.9545
50 06:46:47 0 0.1000 0.20549383197211374 _ _ _ _ _ _ _ _ _ _ 0.9542 0.9542 0.9542 0.9542
51 07:10:30 1 0.1000 0.20552540715422096 _ _ _ _ _ _ _ _ _ _ 0.9541 0.9541 0.9541 0.9541
52 07:34:01 2 0.1000 0.2022382171174598 _ _ _ _ _ _ _ _ _ _ 0.9545 0.9545 0.9545 0.9545
53 07:57:33 0 0.1000 0.2034686032817135 _ _ _ _ _ _ _ _ _ _ 0.955 0.955 0.955 0.955
54 08:20:55 1 0.1000 0.2022125936710602 _ _ _ _ _ _ _ _ _ _ 0.9547 0.9547 0.9547 0.9547
55 08:44:21 0 0.1000 0.20195547892638116 _ _ _ _ _ _ _ _ _ _ 0.9548 0.9548 0.9548 0.9548
56 09:07:58 0 0.1000 0.2010047433371816 _ _ _ _ _ _ _ _ _ _ 0.9549 0.9549 0.9549 0.9549
57 09:31:27 0 0.1000 0.20029620328622608 _ _ _ _ _ _ _ _ _ _ 0.9554 0.9554 0.9554 0.9554
58 09:55:07 0 0.1000 0.200410623315139 _ _ _ _ _ _ _ _ _ _ 0.9555 0.9555 0.9555 0.9555
59 10:18:35 1 0.1000 0.19885549961224236 _ _ _ _ _ _ _ _ _ _ 0.9559 0.9559 0.9559 0.9559
60 10:42:00 0 0.1000 0.1969310509000975 _ _ _ _ _ _ _ _ _ _ 0.9562 0.9562 0.9562 0.9562
61 11:05:28 0 0.1000 0.19752888845380348 _ _ _ _ _ _ _ _ _ _ 0.956 0.956 0.956 0.956
62 11:28:57 1 0.1000 0.19763092688816444 _ _ _ _ _ _ _ _ _ _ 0.9562 0.9562 0.9562 0.9562
63 11:52:29 2 0.1000 0.19646054708820582 _ _ _ _ _ _ _ _ _ _ 0.9561 0.9561 0.9561 0.9561
64 12:16:10 0 0.1000 0.19479624992381397 _ _ _ _ _ _ _ _ _ _ 0.9568 0.9568 0.9568 0.9568
65 12:39:43 0 0.1000 0.19464160452930918 _ _ _ _ _ _ _ _ _ _ 0.9566 0.9566 0.9566 0.9566
66 13:03:14 0 0.1000 0.19548884070323036 _ _ _ _ _ _ _ _ _ _ 0.9568 0.9568 0.9568 0.9568
67 13:26:40 1 0.1000 0.19475973537225158 _ _ _ _ _ _ _ _ _ _ 0.9569 0.9569 0.9569 0.9569
68 13:50:07 2 0.1000 0.19439746426108423 _ _ _ _ _ _ _ _ _ _ 0.9563 0.9563 0.9563 0.9563
69 14:13:32 0 0.1000 0.19370671081686655 _ _ _ _ _ _ _ _ _ _ 0.9564 0.9564 0.9564 0.9564
70 14:36:59 0 0.1000 0.19183528371686911 _ _ _ _ _ _ _ _ _ _ 0.9576 0.9576 0.9576 0.9576
71 15:00:41 0 0.1000 0.19287951452775143 _ _ _ _ _ _ _ _ _ _ 0.9572 0.9572 0.9572 0.9572
72 15:24:07 1 0.1000 0.19141927684724075 _ _ _ _ _ _ _ _ _ _ 0.9576 0.9576 0.9576 0.9576
73 15:47:38 0 0.1000 0.19037402588275787 _ _ _ _ _ _ _ _ _ _ 0.957 0.957 0.957 0.957
74 16:11:07 0 0.1000 0.19046012064034978 _ _ _ _ _ _ _ _ _ _ 0.9571 0.9571 0.9571 0.9571
75 16:34:34 1 0.1000 0.1903303293783702 _ _ _ _ _ _ _ _ _ _ 0.9572 0.9572 0.9572 0.9572
76 16:58:12 0 0.1000 0.19066792594619927 _ _ _ _ _ _ _ _ _ _ 0.9573 0.9573 0.9573 0.9573
77 17:22:06 1 0.1000 0.18952273385669854 _ _ _ _ _ _ _ _ _ _ 0.9573 0.9573 0.9573 0.9573
78 17:45:42 0 0.1000 0.18825725826713122 _ _ _ _ _ _ _ _ _ _ 0.9579 0.9579 0.9579 0.9579
79 18:09:17 0 0.1000 0.18935545475904278 _ _ _ _ _ _ _ _ _ _ 0.958 0.958 0.958 0.958
80 18:32:52 1 0.1000 0.1879875679614127 _ _ _ _ _ _ _ _ _ _ 0.958 0.958 0.958 0.958
81 18:56:27 0 0.1000 0.18778680214824128 _ _ _ _ _ _ _ _ _ _ 0.9584 0.9584 0.9584 0.9584
82 19:20:01 0 0.1000 0.18746448831188942 _ _ _ _ _ _ _ _ _ _ 0.9585 0.9585 0.9585 0.9585
83 19:43:32 0 0.1000 0.18597292849562683 _ _ _ _ _ _ _ _ _ _ 0.9578 0.9578 0.9578 0.9578
84 20:07:22 0 0.1000 0.1860005068718633 _ _ _ _ _ _ _ _ _ _ 0.9582 0.9582 0.9582 0.9582
85 20:30:58 1 0.1000 0.18633582194462925 _ _ _ _ _ _ _ _ _ _ 0.9583 0.9583 0.9583 0.9583
86 20:54:31 2 0.1000 0.18582389025507537 _ _ _ _ _ _ _ _ _ _ 0.9584 0.9584 0.9584 0.9584
87 21:18:02 0 0.1000 0.18583678361469438 _ _ _ _ _ _ _ _ _ _ 0.9584 0.9584 0.9584 0.9584
88 21:41:34 1 0.1000 0.18316908685788577 _ _ _ _ _ _ _ _ _ _ 0.9586 0.9586 0.9586 0.9586
89 22:05:12 0 0.1000 0.18534635613074252 _ _ _ _ _ _ _ _ _ _ 0.958 0.958 0.958 0.958
90 22:28:54 1 0.1000 0.18384256521973166 _ _ _ _ _ _ _ _ _ _ 0.9587 0.9587 0.9587 0.9587
91 22:52:27 2 0.1000 0.18405010121052887 _ _ _ _ _ _ _ _ _ _ 0.9588 0.9588 0.9588 0.9588
92 23:15:59 0 0.0500 0.17556561555019948 _ _ _ _ _ _ _ _ _ _ 0.9603 0.9603 0.9603 0.9603
93 23:39:32 0 0.0500 0.16994592318872956 _ _ _ _ _ _ _ _ _ _ 0.9603 0.9603 0.9603 0.9603
94 00:03:02 0 0.0500 0.16775754866871675 _ _ _ _ _ _ _ _ _ _ 0.9605 0.9605 0.9605 0.9605
95 00:26:30 0 0.0500 0.167797892421305 _ _ _ _ _ _ _ _ _ _ 0.9608 0.9608 0.9608 0.9608
96 00:50:08 1 0.0500 0.1654042344597402 _ _ _ _ _ _ _ _ _ _ 0.961 0.961 0.961 0.961
97 01:13:54 0 0.0500 0.1638417892170097 _ _ _ _ _ _ _ _ _ _ 0.9611 0.9611 0.9611 0.9611
98 01:37:21 0 0.0500 0.1628716704711418 _ _ _ _ _ _ _ _ _ _ 0.9615 0.9615 0.9615 0.9615
99 02:00:54 0 0.0500 0.16135923191805748 _ _ _ _ _ _ _ _ _ _ 0.9614 0.9614 0.9614 0.9614
100 02:24:23 0 0.0500 0.16102960881517717 _ _ _ _ _ _ _ _ _ _ 0.9614 0.9614 0.9614 0.9614
101 02:47:58 0 0.0500 0.1597511584858509 _ _ _ _ _ _ _ _ _ _ 0.9612 0.9612 0.9612 0.9612
102 03:11:33 0 0.0500 0.158750009230552 _ _ _ _ _ _ _ _ _ _ 0.9613 0.9613 0.9613 0.9613
103 03:35:16 0 0.0500 0.1598144823844623 _ _ _ _ _ _ _ _ _ _ 0.9617 0.9617 0.9617 0.9617
104 03:58:47 1 0.0500 0.15843389315256012 _ _ _ _ _ _ _ _ _ _ 0.962 0.962 0.962 0.962
105 04:22:18 0 0.0500 0.1582850175641563 _ _ _ _ _ _ _ _ _ _ 0.9619 0.9619 0.9619 0.9619
106 04:45:55 0 0.0500 0.15683828063503616 _ _ _ _ _ _ _ _ _ _ 0.9619 0.9619 0.9619 0.9619
107 05:09:22 0 0.0500 0.15554793214376686 _ _ _ _ _ _ _ _ _ _ 0.962 0.962 0.962 0.962
108 05:32:50 0 0.0500 0.15559760241105475 _ _ _ _ _ _ _ _ _ _ 0.9618 0.9618 0.9618 0.9618
109 05:56:18 1 0.0500 0.15592738710675413 _ _ _ _ _ _ _ _ _ _ 0.9619 0.9619 0.9619 0.9619
110 06:19:57 2 0.0500 0.1559649915345551 _ _ _ _ _ _ _ _ _ _ 0.9625 0.9625 0.9625 0.9625
111 06:43:28 0 0.0250 0.15047709863006958 _ _ _ _ _ _ _ _ _ _ 0.9626 0.9626 0.9626 0.9626
112 07:06:53 0 0.0250 0.14901342020445227 _ _ _ _ _ _ _ _ _ _ 0.9626 0.9626 0.9626 0.9626
113 07:30:23 0 0.0250 0.14785490136836343 _ _ _ _ _ _ _ _ _ _ 0.9629 0.9629 0.9629 0.9629
114 07:53:54 0 0.0250 0.14793952625254247 _ _ _ _ _ _ _ _ _ _ 0.963 0.963 0.963 0.963
115 08:17:22 1 0.0250 0.14614440136844034 _ _ _ _ _ _ _ _ _ _ 0.9629 0.9629 0.9629 0.9629
116 08:41:03 0 0.0250 0.14456831456853306 _ _ _ _ _ _ _ _ _ _ 0.9631 0.9631 0.9631 0.9631
117 09:04:35 0 0.0250 0.1456362626140971 _ _ _ _ _ _ _ _ _ _ 0.9631 0.9631 0.9631 0.9631
118 09:28:00 1 0.0250 0.1439011391959644 _ _ _ _ _ _ _ _ _ _ 0.9631 0.9631 0.9631 0.9631
119 09:51:25 0 0.0250 0.14387666936879415 _ _ _ _ _ _ _ _ _ _ 0.9632 0.9632 0.9632 0.9632
120 10:14:51 0 0.0250 0.14368277304316643 _ _ _ _ _ _ _ _ _ _ 0.9633 0.9633 0.9633 0.9633
121 10:38:15 0 0.0250 0.14401180188408505 _ _ _ _ _ _ _ _ _ _ 0.9632 0.9632 0.9632 0.9632
122 11:01:45 1 0.0250 0.14190384826544228 _ _ _ _ _ _ _ _ _ _ 0.9632 0.9632 0.9632 0.9632
123 11:25:31 0 0.0250 0.14156404113846752 _ _ _ _ _ _ _ _ _ _ 0.9633 0.9633 0.9633 0.9633
124 11:49:02 0 0.0250 0.14334506448280024 _ _ _ _ _ _ _ _ _ _ 0.9635 0.9635 0.9635 0.9635
125 12:12:31 1 0.0250 0.1419388082834431 _ _ _ _ _ _ _ _ _ _ 0.9634 0.9634 0.9634 0.9634
126 12:36:00 2 0.0250 0.14170677811496898 _ _ _ _ _ _ _ _ _ _ 0.9636 0.9636 0.9636 0.9636
127 12:59:27 0 0.0125 0.13874942668664622 _ _ _ _ _ _ _ _ _ _ 0.9638 0.9638 0.9638 0.9638
128 13:22:56 0 0.0125 0.13761071341297706 _ _ _ _ _ _ _ _ _ _ 0.9639 0.9639 0.9639 0.9639
129 13:46:24 0 0.0125 0.13801613958699138 _ _ _ _ _ _ _ _ _ _ 0.9639 0.9639 0.9639 0.9639
130 14:10:12 1 0.0125 0.13691942280057715 _ _ _ _ _ _ _ _ _ _ 0.964 0.964 0.964 0.964
131 14:33:48 0 0.0125 0.13759175252124392 _ _ _ _ _ _ _ _ _ _ 0.964 0.964 0.964 0.964
132 14:57:26 1 0.0125 0.13605770421426944 _ _ _ _ _ _ _ _ _ _ 0.964 0.964 0.964 0.964
133 15:20:56 0 0.0125 0.13757905333765666 _ _ _ _ _ _ _ _ _ _ 0.9641 0.9641 0.9641 0.9641
134 15:44:27 1 0.0125 0.13585668101601625 _ _ _ _ _ _ _ _ _ _ 0.964 0.964 0.964 0.964
135 16:07:57 0 0.0125 0.1359525140283796 _ _ _ _ _ _ _ _ _ _ 0.9641 0.9641 0.9641 0.9641
136 16:31:41 1 0.0125 0.1353510881129821 _ _ _ _ _ _ _ _ _ _ 0.9642 0.9642 0.9642 0.9642
137 16:55:09 0 0.0125 0.13588478053671538 _ _ _ _ _ _ _ _ _ _ 0.9641 0.9641 0.9641 0.9641
138 17:18:45 1 0.0125 0.13547475301437173 _ _ _ _ _ _ _ _ _ _ 0.9642 0.9642 0.9642 0.9642
139 17:42:22 2 0.0125 0.13502782606054411 _ _ _ _ _ _ _ _ _ _ 0.9642 0.9642 0.9642 0.9642
140 18:05:53 0 0.0125 0.13481195497481782 _ _ _ _ _ _ _ _ _ _ 0.9641 0.9641 0.9641 0.9641
141 18:29:30 0 0.0125 0.13416481617924972 _ _ _ _ _ _ _ _ _ _ 0.9645 0.9645 0.9645 0.9645
142 18:53:03 0 0.0125 0.13502357041225113 _ _ _ _ _ _ _ _ _ _ 0.9643 0.9643 0.9643 0.9643
143 19:16:51 1 0.0125 0.1348685677112196 _ _ _ _ _ _ _ _ _ _ 0.9643 0.9643 0.9643 0.9643
144 19:40:23 2 0.0125 0.13317074967193576 _ _ _ _ _ _ _ _ _ _ 0.9645 0.9645 0.9645 0.9645
145 20:03:53 0 0.0125 0.13302220075108664 _ _ _ _ _ _ _ _ _ _ 0.9644 0.9644 0.9644 0.9644
146 20:27:21 0 0.0125 0.13223227358080888 _ _ _ _ _ _ _ _ _ _ 0.9644 0.9644 0.9644 0.9644
147 20:50:55 0 0.0125 0.13367648299557353 _ _ _ _ _ _ _ _ _ _ 0.9644 0.9644 0.9644 0.9644
148 21:14:30 1 0.0125 0.13366286128920055 _ _ _ _ _ _ _ _ _ _ 0.9644 0.9644 0.9644 0.9644
149 21:38:15 2 0.0125 0.13307078396670677 _ _ _ _ _ _ _ _ _ _ 0.9641 0.9641 0.9641 0.9641
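
The loss.tsv above records one row per epoch. For reference, a minimal sketch for inspecting it (assuming pandas and matplotlib are installed, and that the file is tab-separated with `_` marking unused columns, as written by the Flair trainer):

```python
import pandas as pd
import matplotlib.pyplot as plt

# read the per-epoch training log
log = pd.read_csv("loss.tsv", sep="\t", na_values="_")

# plot training loss and test F-score over the 150 epochs
fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
ax1.plot(log["EPOCH"], log["TRAIN_LOSS"])
ax1.set_ylabel("train loss")
ax2.plot(log["EPOCH"], log["TEST_F-SCORE"])
ax2.set_ylabel("test F-score")
ax2.set_xlabel("epoch")
plt.show()
```
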
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3402226c5babc6cd0b34fc0a16d05528603634255fec03d455621f6898a9433c
size 314055714
training.log
ADDED
The diff for this file is too large to render.