Fill-Mask · Transformers · PyTorch · Japanese · bert · Inference Endpoints
aken12 committed
Commit e3af493
1 Parent(s): e58d212

Update README.md

Files changed (1): README.md +18 -16
README.md CHANGED
@@ -9,22 +9,6 @@ language:
 
 
 
- | | | | JQaRa | | |
- | ------------------- | --- | --------- | --------- | --------- | --------- |
- | | | NDCG@10 | MRR@10 | NDCG@100 | MRR@100 |
- | splade-japanese-v3 | | 0.505 | 0.772 | 0.7 | 0.775 |
- | JaColBERTv2 | | 0.585 | 0.836 | 0.753 | 0.838 |
- | JaColBERT | | 0.549 | 0.811 | 0.730 | 0.814 |
- | bge-m3+all | | 0.576 | 0.818 | 0.745 | 0.820 |
- | bg3-m3+dense | | 0.539 | 0.785 | 0.721 | 0.788 |
- | m-e5-large | | 0.554 | 0.799 | 0.731 | 0.801 |
- | m-e5-base | | 0.471 | 0.727 | 0.673 | 0.731 |
- | m-e5-small | | 0.492 | 0.729 | 0.689 | 0.733 |
- | GLuCoSE | | 0.308 | 0.518 | 0.564 | 0.527 |
- | sup-simcse-ja-base | | 0.324 | 0.541 | 0.572 | 0.550 |
- | sup-simcse-ja-large | | 0.356 | 0.575 | 0.596 | 0.583 |
- | fio-base-v0.1 | | 0.372 | 0.616 | 0.608 | 0.622 |
-
  ## Evaluation on [MIRACL japanese](https://huggingface.co/datasets/miracl/miracl)
  These models were not trained on the MIRACL training data.
 
@@ -41,6 +25,24 @@ These models were not trained on the MIRACL training data.
  *'splade-japanese-v2-doc' model does not require a query encoder during inference.
 
 
  Running the code below lets you inspect the term expansion and term weighting.
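The code block the sentence above refers to did not survive this diff view, so it is left out here. As a rough illustration of what such a check involves, below is a minimal numpy sketch of SPLADE-style term weighting (log(1 + ReLU(logits)), max-pooled over token positions, with nonzero vocabulary dimensions read off as the expansion terms). The toy vocabulary and logits are invented for illustration and are not the output of splade-japanese-v3.

```python
import numpy as np

def splade_weights(logits: np.ndarray) -> np.ndarray:
    """SPLADE-style sparse term weights: log(1 + ReLU(logits)),
    max-pooled over the sequence (token) dimension."""
    return np.max(np.log1p(np.maximum(logits, 0.0)), axis=0)

# Toy example: 3 token positions x 5 vocabulary entries (invented numbers).
toy_vocab = ["tokyo", "japan", "city", "sushi", "rain"]
toy_logits = np.array([
    [ 2.1, 0.3, -1.0,  0.0, -0.5],
    [ 0.2, 1.8,  0.4, -0.2,  0.1],
    [-0.3, 0.5,  1.2,  0.0, -2.0],
])

weights = splade_weights(toy_logits)

# Report expanded terms (nonzero weights), strongest first.
expansion = sorted(
    ((tok, w) for tok, w in zip(toy_vocab, weights) if w > 0),
    key=lambda tw: -tw[1],
)
for tok, w in expansion:
    print(f"{tok}\t{w:.3f}")
```

With a real checkpoint, `logits` would be the fill-mask head's output over the full vocabulary for each input token; terms that never clear the ReLU (like "sushi" above) simply drop out of the sparse representation.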
 
  ## Evaluation on [MIRACL japanese](https://huggingface.co/datasets/miracl/miracl)
  These models were not trained on the MIRACL training data.
 
  *'splade-japanese-v2-doc' model does not require a query encoder during inference.
 
+ ## Evaluation on [hotchpotch/JQaRA](https://huggingface.co/datasets/hotchpotch/JQaRA)
+
+ | | | | JQaRa | | |
+ | ------------------- | --- | --------- | --------- | --------- | --------- |
+ | | | NDCG@10 | MRR@10 | NDCG@100 | MRR@100 |
+ | splade-japanese-v3 | | 0.505 | 0.772 | 0.7 | 0.775 |
+ | JaColBERTv2 | | 0.585 | 0.836 | 0.753 | 0.838 |
+ | JaColBERT | | 0.549 | 0.811 | 0.730 | 0.814 |
+ | bge-m3+all | | 0.576 | 0.818 | 0.745 | 0.820 |
+ | bg3-m3+dense | | 0.539 | 0.785 | 0.721 | 0.788 |
+ | m-e5-large | | 0.554 | 0.799 | 0.731 | 0.801 |
+ | m-e5-base | | 0.471 | 0.727 | 0.673 | 0.731 |
+ | m-e5-small | | 0.492 | 0.729 | 0.689 | 0.733 |
+ | GLuCoSE | | 0.308 | 0.518 | 0.564 | 0.527 |
+ | sup-simcse-ja-base | | 0.324 | 0.541 | 0.572 | 0.550 |
+ | sup-simcse-ja-large | | 0.356 | 0.575 | 0.596 | 0.583 |
+ | fio-base-v0.1 | | 0.372 | 0.616 | 0.608 | 0.622 |
+
 
  Running the code below lets you inspect the term expansion and term weighting.
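For reference on what the table's column headers measure, here is a small self-contained sketch of MRR@10 and NDCG@10 for a single ranked list with binary relevance labels. The ranking below is toy data, not a JQaRA result; real leaderboard numbers average these per-query scores over the whole query set.

```python
import math

def mrr_at_k(relevances, k=10):
    """Reciprocal rank of the first relevant result within the top k."""
    for rank, rel in enumerate(relevances[:k], start=1):
        if rel > 0:
            return 1.0 / rank
    return 0.0

def ndcg_at_k(relevances, k=10):
    """Binary-relevance NDCG@k: DCG of the ranking divided by the DCG
    of an ideal reordering of the same relevance labels."""
    def dcg(rels):
        return sum(rel / math.log2(rank + 1)
                   for rank, rel in enumerate(rels[:k], start=1))
    ideal_dcg = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal_dcg if ideal_dcg > 0 else 0.0

# Toy ranking: 1 = relevant document, 0 = non-relevant (invented labels).
ranking = [0, 1, 0, 0, 1, 0, 0, 0, 0, 0]
print(mrr_at_k(ranking))   # first relevant hit at rank 2 -> 0.5
print(ndcg_at_k(ranking))
```

MRR only rewards the first relevant hit, while NDCG credits every relevant document with a logarithmic rank discount, which is why the two columns can order systems differently.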