julian-schelb committed on
Commit
a704a39
1 Parent(s): 6083051

Update README.md

Files changed (1)
  1. README.md +2 -3
README.md CHANGED
@@ -153,12 +153,11 @@ This model is limited by its training dataset of entity-annotated news articles
  * Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., Levy, O., Lewis, M., Zettlemoyer, L., & Stoyanov, V.. (2019). RoBERTa: A Robustly Optimized BERT Pretraining Approach.
 
 
-## Bibtex Citation
-
-```text
+## Citation
 
 This model was fine-tuned for the following paper. This is how you can cite it if you like:
 
+```bibtex
 @inproceedings{schelbECCEEntitycentricCorpus2022,
   title = {{ECCE}: {Entity}-centric {Corpus} {Exploration} {Using} {Contextual} {Implicit} {Networks}},
   url = {https://dl.acm.org/doi/10.1145/3487553.3524237},