AdaptLLM committed commit 9d23085 (parent: 71c083f)

Update README.md

Files changed (1): README.md (+12, -10)

README.md CHANGED
@@ -183,16 +183,6 @@ To easily reproduce our results, we have uploaded the filled-in zero/few-shot in
 
  **Note:** those filled-in instructions are specifically tailored for models before alignment and do NOT fit for the specific data format required for chat models.
 
- ## Citation
- If you find our work helpful, please cite us:
- ```bibtex
- @article{adaptllm,
- title={Adapting large language models via reading comprehension},
- author={Cheng, Daixuan and Huang, Shaohan and Wei, Furu},
- journal={arXiv preprint arXiv:2309.09530},
- year={2023}
- }
- ```
  # [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
  Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_AdaptLLM__finance-chat)
 
@@ -206,3 +196,15 @@ Detailed results can be found [here](https://huggingface.co/datasets/open-llm-le
  |Winogrande (5-shot) |75.69|
  |GSM8k (5-shot) |18.80|
 
+ ## Citation
+ If you find our work helpful, please cite us:
+ ```bibtex
+ @inproceedings{
+ cheng2024adapting,
+ title={Adapting Large Language Models via Reading Comprehension},
+ author={Daixuan Cheng and Shaohan Huang and Furu Wei},
+ booktitle={The Twelfth International Conference on Learning Representations},
+ year={2024},
+ url={https://openreview.net/forum?id=y886UXPEZ0}
+ }
+ ```