Files changed (1)
  1. README.md +5 -7
README.md CHANGED
@@ -42,12 +42,10 @@ We record the perplexity achieved by our 30k-fine-tuned OPT models on segments o
 
  ## Bibtex
  ```
- @misc{chevalier2023adapting,
- title={Adapting Language Models to Compress Contexts},
- author={Alexis Chevalier and Alexander Wettig and Anirudh Ajith and Danqi Chen},
- year={2023},
- eprint={2305.14788},
- archivePrefix={arXiv},
- primaryClass={cs.CL}
+ @inproceedings{chevalier2023adapting,
+ title={Adapting Language Models to Compress Contexts},
+ author={Chevalier, Alexis and Wettig, Alexander and Ajith, Anirudh and Chen, Danqi},
+ booktitle={Empirical Methods in Natural Language Processing (EMNLP)},
+ year={2023}
  }
  ```