---
license: apache-2.0
language:
- zh
- en
library_name: transformers
pipeline_tag: text-generation
---
|
Original model: https://huggingface.co/Neko-Institute-of-Science/pygmalion-7b

LoRA: https://huggingface.co/ziqingyang/chinese-alpaca-lora-7b

pygmalion-7b was merged with chinese-alpaca-lora-7b to strengthen the model's Chinese-language ability, ~~though the output can read like translated text~~.

Projects used:

https://github.com/ymcui/Chinese-LLaMA-Alpaca

https://github.com/qwopqwop200/GPTQ-for-LLaMa

**Compatible with AutoGPTQ and GPTQ-for-LLaMa**

**When loading with GPTQ-for-LLaMa, set Wbits=4, groupsize=128, model_type=llama**

Text-generation-webui one-click package: https://www.bilibili.com/read/cv23495183
|
|
|
---
|
|
|
Original model: https://huggingface.co/Neko-Institute-of-Science/pygmalion-7b
|
|
|
LoRA: https://huggingface.co/ziqingyang/chinese-alpaca-lora-7b
|
|
|
The pygmalion-7b model was merged with the chinese-alpaca-lora-7b LoRA to strengthen its Chinese-language capability, ~~though the output can read like translated text~~.
|
|
|
Projects used:

https://github.com/ymcui/Chinese-LLaMA-Alpaca

https://github.com/qwopqwop200/GPTQ-for-LLaMa
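
The merge itself can be reproduced with the LoRA-merging script shipped in the Chinese-LLaMA-Alpaca repository. A hedged sketch follows; the script name and flags should be verified against your checkout of that repo, and the output directory is a placeholder:

```shell
# Sketch: merge the pygmalion-7b base with the chinese-alpaca-lora-7b LoRA.
# Run from a clone of https://github.com/ymcui/Chinese-LLaMA-Alpaca; confirm
# the script path and arguments against the repo's own documentation.
python scripts/merge_llama_with_chinese_lora.py \
    --base_model Neko-Institute-of-Science/pygmalion-7b \
    --lora_model ziqingyang/chinese-alpaca-lora-7b \
    --output_type huggingface \
    --output_dir ./pygmalion-7b-chinese-merged  # placeholder output path
```

Merging bakes the LoRA weights into the base checkpoint, so the result can then be quantized with GPTQ-for-LLaMa or AutoGPTQ like any ordinary LLaMA model.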
|
|
|
**Compatible with AutoGPTQ and GPTQ-for-LLaMa**

**When loading with GPTQ-for-LLaMa, set Wbits=4, groupsize=128, model_type=llama**
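
With AutoGPTQ, the quantized weights can be loaded through `AutoGPTQForCausalLM.from_quantized`. A minimal sketch, assuming the quantized model has been downloaded to a local directory (the path below is a placeholder) and a CUDA device is available:

```python
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

# Placeholder path: point this at the directory holding the quantized weights.
model_dir = "./pygmalion-7b-chinese-gptq"

tokenizer = AutoTokenizer.from_pretrained(model_dir, use_fast=True)
# Loads the 4-bit GPTQ checkpoint onto the first GPU.
model = AutoGPTQForCausalLM.from_quantized(model_dir, device="cuda:0")

prompt = "你好,请用中文介绍一下你自己。"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```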