Update README.md
README.md
CHANGED
@@ -143,6 +143,19 @@ We notice that the file may be corrupted during transfer process. Please check M
 | pytorch_model-00007-of-00007.bin | a967e2c6195477b7407089c0bffa2d53 |
 
 
+## Citation
+If you find our [work](https://arxiv.org/abs/2311.02303) useful or helpful for your R&D work, please feel free to cite our paper as below.
+```
+@article{mftcoder2023,
+  title={MFTCoder: Boosting Code LLMs with Multitask Fine-Tuning},
+  author={Bingchang Liu and Chaoyu Chen and Cong Liao and Zi Gong and Huan Wang and Zhichao Lei and Ming Liang and Dajun Chen and Min Shen and Hailian Zhou and Hang Yu and Jianguo Li},
+  year={2023},
+  journal={arXiv preprint arXiv:2311.02303},
+  archivePrefix={arXiv},
+  eprint={2311.02303}
+}
+```
+
 <a id="chinese"></a>
 
 ## 模型简介
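The hunk's context mentions checking MD5 digests against the README's checksum table to detect files corrupted in transfer. As a minimal sketch (not part of the repository), the digests can be verified locally with Python's standard `hashlib`; the `md5sum` helper and the hard-coded table entry are illustrative, with only the shard shown in this hunk included:

```python
import hashlib
import os

def md5sum(path, chunk_size=1 << 20):
    """Compute the MD5 hex digest of a file, reading it in 1 MiB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Expected digests from the checksum table above (only shard 7 is shown here).
EXPECTED = {
    "pytorch_model-00007-of-00007.bin": "a967e2c6195477b7407089c0bffa2d53",
}

if __name__ == "__main__":
    for name, digest in EXPECTED.items():
        if not os.path.exists(name):
            print(f"{name}: not found, skipping")
        elif md5sum(name) == digest:
            print(f"{name}: OK")
        else:
            print(f"{name}: MD5 MISMATCH, re-download this shard")
```

A mismatch on any shard means that file should be re-downloaded before loading the model.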