hjb committed · Commit 7fe192b · Parent: f815645

Update

Changed files:
- README.md +1 -1
- config.json +20 -20
- pytorch_model.bin → test_pytorch_model.bin +0 -0
README.md
CHANGED
@@ -13,7 +13,7 @@ metrics:
 - f1
 ---
 
-# Ælæctra - Finetuned for Named Entity Recognition on the [DaNE dataset](https://danlp.alexandra.dk/304bd159d5de/datasets/ddt.zip) (Hvingelby et al., 2020).
+# Ælæctra - Finetuned for Named Entity Recognition on the [DaNE dataset](https://danlp.alexandra.dk/304bd159d5de/datasets/ddt.zip) (Hvingelby et al., 2020) by Malte Højmark-Bertelsen.
 
 **Ælæctra** is a Danish Transformer-based language model created to enhance the variety of Danish NLP resources with a more efficient model compared to previous state-of-the-art (SOTA) models.
 
 Ælæctra was pretrained with the ELECTRA-Small (Clark et al., 2020) pretraining approach by using the Danish Gigaword Corpus (Strømberg-Derczynski et al., 2020) and evaluated on Named Entity Recognition (NER) tasks. Since NER only presents a limited picture of Ælæctra's capabilities, I am very interested in further evaluations. Therefore, if you employ it for any task, feel free to hit me up with your findings!
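The NER fine-tuning described in the README predicts one BIO-style tag per token (the full tag set appears in the config.json diff in this commit). As a rough, self-contained sketch of how such per-token tags are grouped into entity spans — this is my own illustration, not code from the repository — the decoding step can look like:

```python
# Minimal sketch (assumed, not from the model card): collapse BIO-tagged
# tokens into (entity_text, entity_type) spans, as a token-classification
# NER model like Ælæctra would require after inference.

def bio_to_entities(tokens, tags):
    """Group (token, BIO-tag) pairs into (entity_text, entity_type) spans."""
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag always opens a new span, closing any open one.
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # An I- tag of the same type continues the open span.
            current_tokens.append(token)
        else:
            # "O" (or an inconsistent I- tag) closes any open span.
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [], None
    if current_tokens:
        entities.append((" ".join(current_tokens), current_type))
    return entities

# Hypothetical Danish example sentence, tagged with the scheme from config.json.
tokens = ["Malte", "bor", "i", "København"]
tags = ["B-PER", "O", "O", "B-LOC"]
print(bio_to_entities(tokens, tags))  # [('Malte', 'PER'), ('København', 'LOC')]
```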
config.json
CHANGED
@@ -9,30 +9,30 @@
   "hidden_dropout_prob": 0.1,
   "hidden_size": 256,
   "id2label": {
-    "0": "
-    "1": "
-    "2": "
-    "3": "
-    "4": "
-    "5": "
-    "6": "
-    "7": "
-    "8": "
-    "9": "
+    "0": "B-PER",
+    "1": "I-PER",
+    "2": "B-LOC",
+    "3": "I-LOC",
+    "4": "B-ORG",
+    "5": "I-ORG",
+    "6": "O",
+    "7": "[PAD]",
+    "8": "[CLS]",
+    "9": "[SEP]"
   },
   "initializer_range": 0.02,
   "intermediate_size": 1024,
   "label2id": {
-    "
-    "
-    "
-    "
-    "
-    "
-    "
-    "
-    "
-    "
+    "B-PER": 0,
+    "I-PER": 1,
+    "B-LOC": 2,
+    "I-LOC": 3,
+    "B-ORG": 4,
+    "I-ORG": 5,
+    "O": 6,
+    "[PAD]": 7,
+    "[CLS]": 8,
+    "[SEP]": 9
   },
   "layer_norm_eps": 1e-12,
   "max_position_embeddings": 512,
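The new `id2label` and `label2id` mappings in this commit should be exact inverses of each other, which is what Hugging Face tooling expects when rendering predictions. A small sanity-check sketch (my own, not part of the repository; values copied from the diff above — note that config.json stores `id2label` keys as strings):

```python
# id2label / label2id as they appear in the updated config.json.
id2label = {
    "0": "B-PER", "1": "I-PER", "2": "B-LOC", "3": "I-LOC",
    "4": "B-ORG", "5": "I-ORG", "6": "O",
    "7": "[PAD]", "8": "[CLS]", "9": "[SEP]",
}
label2id = {
    "B-PER": 0, "I-PER": 1, "B-LOC": 2, "I-LOC": 3,
    "B-ORG": 4, "I-ORG": 5, "O": 6,
    "[PAD]": 7, "[CLS]": 8, "[SEP]": 9,
}

# Verify the two maps are inverses; id2label keys are JSON strings, so
# cast them to int before comparing against label2id's values.
inverted = {v: int(k) for k, v in id2label.items()}
assert inverted == label2id
print("mappings are consistent")
```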
pytorch_model.bin → test_pytorch_model.bin
RENAMED
File without changes