Update README.md
README.md CHANGED
@@ -2,6 +2,7 @@
 license: apache-2.0
 datasets:
 - c-s-ale/alpaca-gpt4-data
+pipeline_tag: text2text-generation
 ---

 This repo provides the training checkpoint of LLaMA on the alpaca_data_gpt4 dataset via LoRA [MLP] on 8xA100(80G).
@@ -30,5 +31,4 @@ torchrun --nproc_per_node=8 finetune.py \
 --group_by_length
 ```

-> [1] Junxian He, Chunting Zhou, Xuezhe Ma, Taylor Berg-Kirkpatrick, Graham Neubig: Towards a Unified View of Parameter-Efficient Transfer Learning. ICLR 2022
-
+> [1] Junxian He, Chunting Zhou, Xuezhe Ma, Taylor Berg-Kirkpatrick, Graham Neubig: Towards a Unified View of Parameter-Efficient Transfer Learning. ICLR 2022
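The README line above describes this repo as a LoRA [MLP] adapter checkpoint of LLaMA trained with finetune.py. For readers of this commit, a minimal sketch of how such an adapter is typically applied at inference time with PEFT is shown below; the base-model ID and adapter path are illustrative assumptions, not taken from this repo's documentation.

```python
# Minimal sketch of loading a LoRA adapter checkpoint like this one with PEFT.
# Assumptions: the base-model ID and adapter path below are illustrative,
# not specified by this commit.
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

BASE_MODEL = "decapoda-research/llama-7b-hf"  # assumed base checkpoint
ADAPTER = "path/to/this-lora-checkpoint"      # hypothetical local path or Hub repo ID

tokenizer = LlamaTokenizer.from_pretrained(BASE_MODEL)
base = LlamaForCausalLM.from_pretrained(
    BASE_MODEL,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Attach the LoRA adapter weights on top of the frozen base model.
model = PeftModel.from_pretrained(base, ADAPTER)
model.eval()
```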