hamishivi committed
Commit 4149a29
1 Parent(s): 466f77b

Update README.md

Files changed (1): README.md +11 -7
README.md CHANGED
@@ -18,7 +18,10 @@ base_model: meta-llama/Llama-2-7b-hf
 Tulu is a series of language models that are trained to act as helpful assistants.
 Tulu 1 llama2 7B is a fine-tuned version of Llama 2 that was trained on a mix of publicly available, synthetic and human datasets.
 Specifically, this model is trained on our v1 Tulu data mixture.
-Check out our paper [TODO: link]() for more details!
+
+For more details, read the paper: [Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2
+](https://arxiv.org/abs/2311.10702).
+
 
 
 ## Model description
@@ -118,12 +121,13 @@ If you use this model, please cite the original Tulu work:
 If you find Tulu 2 is useful in your work, please cite it with:
 
 ```
-@misc{ivison2023changing,
-title={Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2},
-author={Hamish Ivison and Yizhong Wang and Valentina Pyatkin and Nathan Lambert and Matthew Peters and Pradeep Dasigi and Joel Jang and David Wadden and Noah A. Smith and Iz Beltagy and Hannaneh Hajishirzi},
-year={2023},
-archivePrefix={arXiv},
-primaryClass={cs.CL}
+@misc{ivison2023camels,
+title={Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2},
+author={Hamish Ivison and Yizhong Wang and Valentina Pyatkin and Nathan Lambert and Matthew Peters and Pradeep Dasigi and Joel Jang and David Wadden and Noah A. Smith and Iz Beltagy and Hannaneh Hajishirzi},
+year={2023},
+eprint={2311.10702},
+archivePrefix={arXiv},
+primaryClass={cs.CL}
 }
 ```