mikecovlee
committed on
Commit 29186b0
1 Parent(s): b3587d7
Update README.md
README.md
CHANGED
````diff
@@ -27,25 +27,27 @@ The table above presents the performance of MixLoRA and compares these results w
 
 ## How to Use
 
-Please visit our GitHub repository: https://github.com/mikecovlee/
+Please visit our GitHub repository: https://github.com/mikecovlee/mLoRA
 
 ## Citation
 If MixLoRA has been useful for your work, please consider citing it using the appropriate citation format for your publication.
 ```bibtex
-@misc{
-
-
-
-
-
+@misc{li2024mixloraenhancinglargelanguage,
+      title={MixLoRA: Enhancing Large Language Models Fine-Tuning with LoRA-based Mixture of Experts},
+      author={Dengchun Li and Yingzi Ma and Naizheng Wang and Zhengmao Ye and Zhiyuan Cheng and Yinghao Tang and Yan Zhang and Lei Duan and Jie Zuo and Cal Yang and Mingjie Tang},
+      year={2024},
+      eprint={2404.15159},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL},
+      url={https://arxiv.org/abs/2404.15159},
 }
 
 @misc{alpaca-mixlora-7b,
-  author
-  title = {MixLoRA
+  author={Dengchun Li and Yingzi Ma and Naizheng Wang and Zhengmao Ye and Zhiyuan Cheng and Yinghao Tang and Yan Zhang and Lei Duan and Jie Zuo and Cal Yang and Mingjie Tang},
+  title = {MixLoRA adapter based on AlpacaCleaned dataset and LLaMA-2-7B base model},
   year = {2024},
   publisher = {HuggingFace Hub},
-  howpublished = {\url{https://huggingface.co/
+  howpublished = {\url{https://huggingface.co/TUDB-Labs/alpaca-mixlora-7b}},
 }
 ```
 
````