---
inference: false
language:
- en
tags:
- instruction-finetuning
task_categories:
- text-generation
---
# xFinder-qwen1505

## Model Details

xFinder-qwen1505 is a model specifically designed for key answer extraction from the outputs of large language models (LLMs). It was trained by fine-tuning Qwen-1.5-0.5B.

- **Developed by:** [IAAR](https://www.iaar.ac.cn)
- **Fine-tuned from model:** [Qwen-1.5-0.5B](https://huggingface.co/Qwen/Qwen1.5-0.5B)

## Model Sources

- **Repository:** https://github.com/IAAR-Shanghai/xFinder
- **Paper:** https://arxiv.org/abs/2405.11874

## Uses

xFinder is primarily used to make the evaluation of LLMs more reliable by accurately extracting key answers from their outputs. It addresses the limitations of traditional regular expression (RegEx)-based extraction, which often fails on the diverse and free-form outputs that LLMs generate, and thereby improves the reliability of model assessment across a variety of tasks.

## Training Details

xFinder-qwen1505 is fine-tuned from Qwen-1.5-0.5B on approximately 26.9K samples from the Key Answer Finder (KAF) dataset. The KAF dataset is designed to improve the accuracy and robustness of key answer extraction and covers a variety of task types; it was meticulously annotated by GPT-4 and human experts to ensure high-quality training and evaluation data. For more details, see the [paper](https://arxiv.org/abs/2405.11874) and try it with the [code](https://github.com/IAAR-Shanghai/xFinder).

## Evaluation

xFinder is evaluated on the fully human-annotated test and generalization sets of the KAF dataset. The results show significant improvements in extraction accuracy and robustness over traditional methods. For more details, please refer to the paper and try it out using the provided code.

## Citation

```
@article{xFinder,
  title={xFinder: Robust and Pinpoint Answer Extraction for Large Language Models},
  author={Qingchen Yu and Zifan Zheng and Shichao Song and Zhiyu Li and Feiyu Xiong and Bo Tang and Ding Chen},
  journal={arXiv preprint arXiv:2405.11874},
  year={2024}
}
```