hamishivi committed
Commit: d552117
Parent: cac9a11

Update README.md

Files changed (1): README.md (+10, -7)
README.md CHANGED
@@ -17,7 +17,9 @@ base_model: meta-llama/Llama-2-7b-hf
 
 This model belongs to the Tulu series of models, which is a series of language models that are trained to act as helpful assistants.
 Open Instruct ShareGPT Llama2 7B is a fine-tuned version of Llama 2 that was trained on the [ShareGPT dataset](https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered).
-Please check out our paper [TODO] for more!
+
+For more details, read the paper: [Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2
+](https://arxiv.org/abs/2311.10702).
 
 
 ## Model description
@@ -106,12 +108,13 @@ The following hyperparameters were used during DPO training:
 If you find this model is useful in your work, please cite it with:
 
 ```
-@misc{ivison2023changing,
-title={Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2},
-author={Hamish Ivison and Yizhong Wang and Valentina Pyatkin and Nathan Lambert and Matthew Peters and Pradeep Dasigi and Joel Jang and David Wadden and Noah A. Smith and Iz Beltagy and Hannaneh Hajishirzi},
-year={2023},
-archivePrefix={arXiv},
-primaryClass={cs.CL}
+@misc{ivison2023camels,
+title={Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2},
+author={Hamish Ivison and Yizhong Wang and Valentina Pyatkin and Nathan Lambert and Matthew Peters and Pradeep Dasigi and Joel Jang and David Wadden and Noah A. Smith and Iz Beltagy and Hannaneh Hajishirzi},
+year={2023},
+eprint={2311.10702},
+archivePrefix={arXiv},
+primaryClass={cs.CL}
 }
 ```
 
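
The README text changed above describes an instruction-tuned Llama 2 checkpoint hosted on the Hugging Face Hub. As a rough usage sketch (not part of this commit): the model can be loaded with the `transformers` library as below; the repository id and the Tulu-style chat prompt are assumptions for illustration, not stated in this diff.

```python
# Minimal sketch for loading Open Instruct ShareGPT Llama2 7B with transformers.
# NOTE: the repo id below is an assumed placeholder -- replace it with the actual model repo.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/open-instruct-sharegpt-llama2-7b"  # assumption, not confirmed by this diff

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 7B parameters fit on a single ~24 GB GPU in bf16
    device_map="auto",           # requires the `accelerate` package
)

# Tulu-style prompt template (assumed for this checkpoint).
prompt = "<|user|>\nWrite a haiku about open instruction tuning.\n<|assistant|>\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Strip the prompt tokens and print only the newly generated reply.
reply = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(reply)
```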