---
license: apache-2.0
language:
- zh
- en
library_name: transformers
pipeline_tag: text-generation
---
Original model: https://huggingface.co/TehVenom/Pygmalion-13b-Merged

LoRA: https://huggingface.co/ziqingyang/chinese-alpaca-lora-13b

Pygmalion-13b was merged with chinese-alpaca-lora-13b to strengthen the model's Chinese ability, ~~though the output can still read like translated text~~.

Projects used:

https://github.com/ymcui/Chinese-LLaMA-Alpaca

https://github.com/qwopqwop200/GPTQ-for-LLaMa

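The LoRA merge was done with the Chinese-LLaMA-Alpaca project listed above. As a rough illustration of the same idea, here is a minimal sketch using the `peft` library instead; the 16-bit dtype, the output directory, and the embedding-resize step are assumptions, not the exact procedure used for this repo.

```python
# Minimal sketch (not the exact procedure used here): fold the Chinese-Alpaca
# LoRA into the Pygmalion-13b base with peft, then save the merged weights.
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

base_id = "TehVenom/Pygmalion-13b-Merged"
lora_id = "ziqingyang/chinese-alpaca-lora-13b"
out_dir = "pygmalion-13b-chinese"              # placeholder output path

# Chinese-Alpaca ships an extended Chinese tokenizer, so the base embeddings
# are resized to the new vocabulary before the adapter is applied (assumption).
tokenizer = LlamaTokenizer.from_pretrained(lora_id)
base = LlamaForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)
base.resize_token_embeddings(len(tokenizer))

model = PeftModel.from_pretrained(base, lora_id)
model = model.merge_and_unload()               # merge LoRA weights into the base

model.save_pretrained(out_dir)
tokenizer.save_pretrained(out_dir)
```
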
**Compatible with AutoGPTQ and GPTQ-for-LLaMa**

**If you choose to load it with GPTQ-for-LLaMa, set Wbits=4, groupsize=128, and model_type=llama**

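With AutoGPTQ, loading the 4-bit checkpoint looks roughly like the sketch below; the local model path is a placeholder, and `use_safetensors` should be set to match whichever checkpoint file the repository actually contains.

```python
# Minimal sketch: load the 4-bit, groupsize-128 GPTQ checkpoint with AutoGPTQ.
# "path/to/this-model" is a placeholder for the downloaded repository.
from auto_gptq import AutoGPTQForCausalLM
from transformers import AutoTokenizer

model_dir = "path/to/this-model"

tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoGPTQForCausalLM.from_quantized(
    model_dir,
    device="cuda:0",
    use_safetensors=True,   # adjust to match the checkpoint format
)

prompt = "你好,请用中文介绍一下你自己。"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
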
Text-generation-webui one-click package (Chinese guide):

https://www.bilibili.com/read/cv23495183