mikecovlee committed
Commit 6f1329a • 1 Parent(s): c28e12d
Update README.md
README.md CHANGED
@@ -22,6 +22,10 @@ In experiments, MixLoRA achieves commendable performance across all evaluation m
 
 The table above presents the performance of MixLoRA and compares these results with outcomes obtained by employing LoRA and DoRA for fine-tuning. The results demonstrate that the language model with MixLoRA achieves commendable performance across all evaluation methods. All methods are fine-tuned and evaluated with [meta-llama/Llama-2-7b-hf](https://huggingface.co/meta-llama/Llama-2-7b-hf) on m-LoRA, with all metrics reported as accuracy.
 
+## How to Use
+
+Please visit our GitHub repository: https://github.com/mikecovlee/mlora
+
 ## Citation
 
 If MixLoRA has been useful for your work, please consider citing it using the appropriate citation format for your publication.
 
 ```bibtex
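
The new "How to Use" section only points to the GitHub repository. As a minimal, hypothetical illustration of the evaluation setup described in the diffed paragraph (all methods are fine-tuned and evaluated from [meta-llama/Llama-2-7b-hf](https://huggingface.co/meta-llama/Llama-2-7b-hf)), the sketch below loads that base model with the Hugging Face `transformers` API. The MixLoRA fine-tuning and evaluation commands themselves are provided by the m-LoRA toolkit at https://github.com/mikecovlee/mlora; the dtype and device settings here are illustrative assumptions, not part of the commit.

```python
# Minimal sketch (assumption): load the Llama-2-7b base model referenced in the
# README's evaluation setup. MixLoRA fine-tuning itself is driven by the m-LoRA
# toolkit (https://github.com/mikecovlee/mlora); see that repository for commands.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model = "meta-llama/Llama-2-7b-hf"  # base model used for all methods in the table

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(
    base_model,
    torch_dtype=torch.float16,  # illustrative choice; adjust to available hardware
    device_map="auto",          # requires the `accelerate` package
)
```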