---
license: apache-2.0
language:
- zh
- en
base_model:
- Qwen/Qwen2-7B-Instruct
- meta-llama/Llama-3.1-8B-Instruct
pipeline_tag: text-generation
---
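This repository appears to ship a LoRA adapter (see the training procedure below) for the base models listed above. A minimal loading sketch with Transformers and PEFT, assuming the adapter targets `Qwen/Qwen2-7B-Instruct`; the adapter path below is a placeholder, not the actual repo id:

```python
# Minimal sketch, not an official usage snippet: load one of the listed base
# models and attach this repository's LoRA adapter with PEFT.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "Qwen/Qwen2-7B-Instruct"               # or "meta-llama/Llama-3.1-8B-Instruct"
adapter_id = "path/or/repo-id-of-this-adapter"   # placeholder; replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(model, adapter_id)

messages = [{"role": "user", "content": "Introduce yourself briefly."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```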
## Training procedure
Fine-tuned with LoRA using the following hyperparameters (an illustrative configuration sketch follows the list):
- total_batch_size: 32
- epochs: 3
- learning_rate: 1.0e-4
- warmup_ratio: 0.1
- finetuning_type: LoRA
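The actual training used LLaMA-Factory (see the version below). As a rough illustration only, the hyperparameters above map onto a PEFT/Transformers setup along these lines; the LoRA rank/alpha/target modules and the per-device vs. gradient-accumulation split are not stated on this card and are placeholders:

```python
# Rough PEFT/Transformers equivalent of the hyperparameters listed above;
# the actual run used LLaMA-Factory v0.9.0, not this script.
from peft import LoraConfig, get_peft_model, TaskType
from transformers import AutoModelForCausalLM, TrainingArguments

model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-7B-Instruct", torch_dtype="auto")

# LoRA rank/alpha are assumptions; the card only states that LoRA was used.
lora_cfg = LoraConfig(task_type=TaskType.CAUSAL_LM, r=8, lora_alpha=16)
model = get_peft_model(model, lora_cfg)

training_args = TrainingArguments(
    output_dir="lora-output",
    per_device_train_batch_size=4,   # 4 x 8 = total_batch_size 32 (split is assumed)
    gradient_accumulation_steps=8,
    num_train_epochs=3,              # epochs: 3
    learning_rate=1.0e-4,            # learning_rate: 1.0e-4
    warmup_ratio=0.1,                # warmup_ratio: 0.1
)
# A Trainer with the dataset linked in the Data section would then be constructed here.
```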
## Framework versions
- LLaMA-Factory: v0.9.0
## Paper
- link: https://arxiv.org/abs/2412.04905
## Data
- link: https://github.com/MozerWang/DEMO