---
license: apache-2.0
pipeline_tag: text-generation
datasets:
- Nekochu/novel17_train_alpaca_format
language:
- fr
- en
---
Tools used:
- hiyouga/LLaMA-Efficient-Tuning (QLoRA fine-tuning)
- qwopqwop200/GPTQ-for-LLaMa (4-bit GPTQ quantization)
Note: QLoRA training was run on Windows with Python 3.11 and CUDA 11.8, on 24 GB of VRAM.
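For reference, a minimal QLoRA setup along these lines could look like the sketch below, written directly against transformers, peft, and bitsandbytes rather than the LLaMA-Efficient-Tuning CLI. The base model name, LoRA hyperparameters, and targets are placeholders for illustration, not the exact values used for this model; the Nekochu/novel17_train_alpaca_format dataset would then be formatted as Alpaca prompts and passed to a standard Trainer.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit NF4 quantization so the base model fits in 24 GB of VRAM.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
    bnb_4bit_use_double_quant=True,
)

# Placeholder base checkpoint; the actual base model is not restated here.
base_model = "meta-llama/Llama-2-13b-hf"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(
    base_model,
    quantization_config=bnb_config,
    device_map="auto",
)

# Attach LoRA adapters; rank, alpha, and target modules are illustrative
# defaults, not the exact training settings used for this release.
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```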
Known issue: loading the 4-bit version in oobabooga/text-generation-webui with AutoGPTQ produces gibberish output; use the ExLlama loader instead of AutoGPTQ.
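Outside the webui, the GPTQ 4-bit weights can also be loaded with ExLlama directly. The sketch below follows the pattern of turboderp/exllama's example scripts and assumes it is run from inside that repository (its `model`, `tokenizer`, and `generator` modules are top-level there); the model directory and sampling settings are placeholders.

```python
import glob
import os

from model import ExLlama, ExLlamaCache, ExLlamaConfig
from tokenizer import ExLlamaTokenizer
from generator import ExLlamaGenerator

# Placeholder path to the downloaded GPTQ 4-bit model directory.
model_directory = "./models/novel17-4bit-128g/"
tokenizer_path = os.path.join(model_directory, "tokenizer.model")
model_config_path = os.path.join(model_directory, "config.json")
model_path = glob.glob(os.path.join(model_directory, "*.safetensors"))[0]

config = ExLlamaConfig(model_config_path)  # read the model's config.json
config.model_path = model_path             # point at the .safetensors weights

model = ExLlama(config)
tokenizer = ExLlamaTokenizer(tokenizer_path)
cache = ExLlamaCache(model)
generator = ExLlamaGenerator(model, tokenizer, cache)

# Illustrative sampling settings.
generator.settings.temperature = 0.7
generator.settings.top_p = 0.9

print(generator.generate_simple("Il était une fois", max_new_tokens=128))
```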