---
license: openrail
model_creator: axiong
model_name: PMC_LLaMA_13B
---

# PMC_LLaMA_13B - AWQ

## Description

This repository contains AWQ-quantized model files for PMC_LLaMA_13B.

## About AWQ

AWQ is an efficient, accurate, and fast low-bit weight quantization method, currently supporting 4-bit quantization. Compared to GPTQ at its most commonly used settings, it offers faster Transformers-based inference with equivalent or better output quality.
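As a rough sketch, an AWQ checkpoint like this one can be loaded through Hugging Face Transformers (which dispatches to the `autoawq` backend when it is installed). The repository id `alecocc/pmc-llama-13b-awq` and the example prompt below are assumptions for illustration; verify the repo id on the Hub before use.

```python
# Minimal sketch: loading 4-bit AWQ weights with Transformers.
# Assumes `transformers`, `torch`, and `autoawq` are installed and a GPU is
# available; the repo id is an assumption, not confirmed by this card.
from transformers import AutoModelForCausalLM, AutoTokenizer


def load_model(repo_id: str = "alecocc/pmc-llama-13b-awq"):
    """Load the quantized checkpoint and its tokenizer onto the GPU."""
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model()
    prompt = "What are the common symptoms of iron-deficiency anemia?"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Because the weights are stored in 4-bit, the 13B model fits in far less GPU memory than the fp16 original while keeping inference quality close to it.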