ChatGLM-6B Mirror Introduction

ChatGLM-6B is an open-source, Chinese-English bilingual conversational language model based on the General Language Model (GLM) architecture, with 6.2 billion parameters. Combined with model quantization, it can be deployed locally on consumer-grade graphics cards (requiring as little as 6 GB of VRAM at the INT4 quantization level). ChatGLM-6B uses technology similar to ChatGPT and is optimized for Chinese question answering and dialogue. Trained on roughly 1T tokens of Chinese and English text, and further refined with supervised fine-tuning, feedback bootstrapping, and reinforcement learning from human feedback, the 6.2-billion-parameter ChatGLM-6B can already generate answers that are fairly well aligned with human preferences.
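
As a rough sketch of the INT4 deployment mentioned above, the weights can be quantized at load time through the hook shipped in the model's bundled remote code. This follows the upstream THUDM/ChatGLM-6B usage and is an assumption for this mirror; the "THUDM/chatglm-6b" model ID below refers to the upstream repository, not this mirror.

from transformers import AutoTokenizer, AutoModel

# Load ChatGLM-6B and quantize the weights to INT4 so inference fits in
# roughly 6 GB of VRAM on a consumer GPU (upstream usage; assumed for this mirror).
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).quantize(4).half().cuda()
model = model.eval()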

Usage

# Download the model snapshot from ModelScope; returns the local cache path
from modelscope import snapshot_download
model_dir = snapshot_download('Genius-Society/chatglm_6b')
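
Once the snapshot is downloaded, one way to run a conversation is to load the local files with transformers, as sketched below. Here model_dir comes from the snippet above, and the chat method is provided by the model's bundled remote code, so the exact interface for this mirror is an assumption rather than a documented guarantee.

from transformers import AutoTokenizer, AutoModel

# Load the tokenizer and model from the local snapshot directory;
# trust_remote_code is needed because ChatGLM-6B ships its own modeling code.
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).half().cuda()
model = model.eval()

# Single-turn chat; pass the returned history back in for multi-turn dialogue.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)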

Maintenance

git clone git@hf.co:Genius-Society/chatglm_6b
cd chatglm_6b

Reference

[1] ChatGLM-6B
