---
library_name: peft
base_model: NousResearch/Llama-2-7b-hf
---
# SeanLee97/bellm-llama-7b-nli
This repository provides pretrained adapter weights for [BeLLM: Backward Dependency Enhanced Large Language Model for Sentence Embeddings (NAACL 2024)](https://arxiv.org/abs/2311.05296).

For usage details, please refer to: https://github.com/4AI/BeLLM. A minimal loading sketch is given below.
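
The following is a minimal sketch of how these weights might be loaded as a PEFT adapter on top of the `NousResearch/Llama-2-7b-hf` base model. The pooling strategy (last-token hidden state) and the generation of embeddings shown here are assumptions for illustration only; consult https://github.com/4AI/BeLLM for the official usage.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "NousResearch/Llama-2-7b-hf"
adapter_id = "SeanLee97/bellm-llama-7b-nli"

tokenizer = AutoTokenizer.from_pretrained(base_id)
# Llama has no pad token by default; reuse EOS and pad on the right
# so the last non-pad token index is easy to compute.
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "right"

base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

sentences = [
    "A man is playing a guitar.",
    "Someone is strumming an instrument.",
]
inputs = tokenizer(sentences, padding=True, return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# Last-token pooling over the final hidden layer (assumption).
last_hidden = outputs.hidden_states[-1]
last_token_idx = inputs["attention_mask"].sum(dim=1) - 1
embeddings = last_hidden[torch.arange(last_hidden.size(0)), last_token_idx]

similarity = torch.nn.functional.cosine_similarity(
    embeddings[0], embeddings[1], dim=0
)
print(f"cosine similarity: {similarity.item():.4f}")
```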
# Citation
```bibtex
@inproceedings{li2024bellm,
title = "BeLLM: Backward Dependency Enhanced Large Language Model for Sentence Embeddings",
author = "Li, Xianming and Li, Jing",
booktitle = "Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics",
year = "2024",
publisher = "Association for Computational Linguistics"
}
```