---
license: gpl-3.0
---

# 8-bit quantization with group size 128 for LLaMA 7B

This is a Chinese instruction-tuned LoRA checkpoint based on LLaMA-7B, from this repo's work. It consumes approximately 8.5 GB of GPU memory.
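The card itself does not show how the quantization works. As an illustration only, here is a minimal pure-Python sketch of asymmetric group-wise quantization with 8 bits and group size 128; the function names are hypothetical, and real GPTQ-style quantizers are considerably more sophisticated (they minimize layer output error, not just per-weight rounding error):

```python
def quantize_group(w, bits=8):
    """Asymmetric uniform quantization of one group of weights to unsigned ints."""
    qmax = (1 << bits) - 1
    lo, hi = min(w), max(w)
    # Each group stores its own scale and zero-point (lo), so outliers in one
    # group do not degrade the precision of the others.
    scale = (hi - lo) / qmax if hi > lo else 1.0
    q = [round((x - lo) / scale) for x in w]
    return q, scale, lo

def quantize_groupwise(w, bits=8, group_size=128):
    """Split a flat weight list into groups of `group_size` and quantize each."""
    return [quantize_group(w[i:i + group_size], bits)
            for i in range(0, len(w), group_size)]

def dequantize_groupwise(packed):
    """Reconstruct approximate float weights from (ints, scale, lo) triples."""
    w = []
    for q, scale, lo in packed:
        w.extend(qi * scale + lo for qi in q)
    return w
```

With 8 bits and weights in a range of width 2, each group's quantization step is at most 2/255, so the reconstruction error per weight stays below half of that step.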

"input":the mean of life is
"output":the mean of life is 70 years.
the median age at death in a population, regardless if it's male or female?