# Model Card: sigridjineth/ko-reranker-v1.1-preview

**Note: This is a preview release.** The model is under active development and may change as we refine and improve its performance.

## Overview

**sigridjineth/ko-reranker-v1.1-preview** is a Korean reranker fine-tuned to deliver high-quality, context-aware relevance scores for Korean text. It builds upon [Alibaba-NLP/gte-multilingual-reranker-base](https://huggingface.co/Alibaba-NLP/gte-multilingual-reranker-base) and employs hard negative mining and teacher-student distillation to achieve robust performance.

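A minimal usage sketch follows. It assumes the standard `transformers` cross-encoder interface used by gte-multilingual reranker checkpoints (hence `trust_remote_code=True`); the helper name and `max_length` are illustrative, not part of the release:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification


def score_pairs(model_name: str, pairs: list[list[str]]) -> list[float]:
    """Score (query, passage) pairs with a cross-encoder reranker."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    # gte-multilingual-based checkpoints ship custom modeling code,
    # so trust_remote_code is required.
    model = AutoModelForSequenceClassification.from_pretrained(
        model_name, trust_remote_code=True
    )
    model.eval()
    with torch.no_grad():
        inputs = tokenizer(
            pairs, padding=True, truncation=True,
            max_length=512, return_tensors="pt",
        )
        logits = model(**inputs, return_dict=True).logits.view(-1).float()
    return logits.tolist()


# Example (downloads the checkpoint on first use):
# score_pairs(
#     "sigridjineth/ko-reranker-v1.1-preview",
#     [["대한민국의 수도는?", "서울은 대한민국의 수도이다."],
#      ["대한민국의 수도는?", "부산은 대한민국의 항구 도시이다."]],
# )
```

Higher logits indicate higher query-passage relevance; scores are comparable within one candidate list rather than on an absolute scale.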
## Training Data

We leveraged [sigridjineth/korean_nli_dataset_reranker_v0](https://huggingface.co/datasets/sigridjineth/korean_nli_dataset_reranker_v0) as the core training resource. This dataset is itself composed of multiple publicly available datasets:

- **kor_nli (train)**: [https://huggingface.co/datasets/kor_nli](https://huggingface.co/datasets/kor_nli)
- **mnli_ko (train)**: [https://huggingface.co/datasets/kozistr/mnli_ko](https://huggingface.co/datasets/kozistr/mnli_ko)
- **ko-wiki-reranking (train)**: [https://huggingface.co/datasets/upskyy/ko-wiki-reranking](https://huggingface.co/datasets/upskyy/ko-wiki-reranking)
- **mr_tydi_korean (train)**: [https://huggingface.co/datasets/castorini/mr-tydi](https://huggingface.co/datasets/castorini/mr-tydi)
- **klue_nli (train)**: [https://huggingface.co/datasets/klue/klue](https://huggingface.co/datasets/klue/klue)

Together these resources cover a wide range of topics, styles, and complexities in Korean, giving the model the diversity it needs to handle varied linguistic nuances.

## Key Features

- **Hard Negative Mining**:
  Utilized [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) to extract challenging negatives. This enriches the training set with more difficult contrasts, enhancing model robustness.

- **Teacher Distillation**:
  Employed [BAAI/bge-reranker-v2.5-gemma2-lightweight](https://huggingface.co/BAAI/bge-reranker-v2.5-gemma2-lightweight) as a teacher model. The student learned from teacher-provided signals (positive and negative scores), accelerating convergence and boosting final performance.

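Both techniques can be sketched in a few lines. The functions below are illustrative, not the actual training code: the names, the plain cosine-similarity mining, and the listwise KL formulation are our assumptions about how such a pipeline is typically built.

```python
import math

import numpy as np


def mine_hard_negatives(q_emb, passage_embs, positive_idx, k=2):
    """Pick the k non-positive passages most similar to the query.

    q_emb: (d,) query embedding; passage_embs: (n, d) passage embeddings,
    e.g. produced by an embedder such as BAAI/bge-m3. High-similarity
    non-positives make far harder negatives than random sampling.
    """
    q = q_emb / np.linalg.norm(q_emb)
    p = passage_embs / np.linalg.norm(passage_embs, axis=1, keepdims=True)
    sims = p @ q                     # cosine similarity per passage
    ranked = np.argsort(-sims)       # most similar first
    return [int(i) for i in ranked if i != positive_idx][:k]


def listwise_distill_loss(student_scores, teacher_scores, temperature=1.0):
    """KL(teacher || student) over one query's candidate list.

    Soft teacher scores (for positives and negatives alike) supervise
    the student reranker instead of hard 0/1 labels.
    """
    def softmax(xs):
        m = max(xs)
        exps = [math.exp((x - m) / temperature) for x in xs]
        z = sum(exps)
        return [e / z for e in exps]

    t, s = softmax(teacher_scores), softmax(student_scores)
    return sum(ti * math.log(ti / si) for ti, si in zip(t, s))
```

The loss is zero when the student's score distribution matches the teacher's and grows as the two rankings diverge, so minimizing it transfers the teacher's relevance judgments.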
## Intended Use

This model is well-suited for:

- **Search & Information Retrieval**: Improving document ranking for Korean-language search results.
- **Question Answering (QA)**: Enhancing QA pipelines by reordering candidate answers for better relevance.
- **Content Recommendation**: Refining results in recommendation engines that rely on textual signals.

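In all three settings the reranker sits after first-stage retrieval and reorders the candidates. A minimal sketch, where the hypothetical `score_fn` stands in for a call to the model:

```python
def rerank(query, docs, score_fn, top_k=3):
    """Reorder first-stage candidates by relevance score.

    score_fn takes a list of [query, doc] pairs and returns one
    relevance score per pair; in practice it wraps the reranker model.
    """
    scores = score_fn([[query, d] for d in docs])
    ranked = sorted(zip(docs, scores), key=lambda pair: -pair[1])
    return [d for d, _ in ranked[:top_k]]


# Toy score_fn: word overlap with the query stands in for the model.
def overlap_score(pairs):
    return [len(set(q.split()) & set(d.split())) for q, d in pairs]
```

Because the cross-encoder reads each query-document pair jointly, it is slower than a bi-encoder retriever; applying it only to the top retrieved candidates keeps the pipeline fast.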
## Limitations & Future Work

- **Preview Release**:
  As a preview, the model may not be fully optimized. Future releases aim to improve stability and generalization.

- **Lack of Evaluation**:
  A dedicated benchmark is still needed to evaluate how well rerankers generalize across Korean retrieval tasks; systematic evaluation results for this model are not yet available.

## References

For more on the methodologies and theories behind multilingual rerankers and text embeddings, we encourage reviewing the following references:

```
@misc{zhang2024mgtegeneralizedlongcontexttext,
  title={mGTE: Generalized Long-Context Text Representation and Reranking Models for Multilingual Text Retrieval},
  author={Xin Zhang and Yanzhao Zhang and Dingkun Long and Wen Xie and Ziqi Dai and Jialong Tang and Huan Lin and Baosong Yang and Pengjun Xie and Fei Huang and Meishan Zhang and Wenjie Li and Min Zhang},
  year={2024},
  eprint={2407.19669},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2407.19669},
}

@misc{li2023making,
  title={Making Large Language Models A Better Foundation For Dense Retrieval},
  author={Chaofan Li and Zheng Liu and Shitao Xiao and Yingxia Shao},
  year={2023},
  eprint={2312.15503},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2312.15503},
}

@misc{chen2024bge,
  title={BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation},
  author={Jianlv Chen and Shitao Xiao and Peitian Zhang and Kun Luo and Defu Lian and Zheng Liu},
  year={2024},
  eprint={2402.03216},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2402.03216},
}
```

## Contact & Feedback

We welcome constructive feedback, suggestions, and contributions. For improvements or inquiries, please reach out via GitHub issues or the Hugging Face Discussions. We are committed to continuous iteration and to making **sigridjineth/ko-reranker-v1.1-preview** the go-to solution for Korean text reranking.