Model Details
This is an AWQ GEMV quant of magnum-v3-34b: https://huggingface.co/anthracite-org/magnum-v3-34b
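For reference, a minimal loading sketch with AutoAWQ is shown below. The repo id matches this page, but the inference settings (fused layers, device mapping, sampling length) are assumptions, not the only way to run the model:

```python
# Load the AWQ GEMV checkpoint with AutoAWQ; GEMV kernels are geared
# toward single-request / low-batch decoding. Settings below are assumptions.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

quant_path = "andriadze/anthracite-magnum-v3-34b-awq-gemv"

model = AutoAWQForCausalLM.from_quantized(
    quant_path,
    fuse_layers=True,   # fuse attention/MLP modules for faster decoding
    device_map="auto",  # spread the 34B weights across available GPUs
)
tokenizer = AutoTokenizer.from_pretrained(quant_path)

prompt = "Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```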
Model Description
The model was quantized on 6x RTX 4090 GPUs with the following quantization parameters:
"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMV"