## Model Details
This is an AWQ GEMM quant of magnum-v3-34b: https://huggingface.co/anthracite-org/magnum-v3-34b
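The card does not include a usage snippet, so here is a minimal loading sketch with `transformers` (it assumes `autoawq` is installed so the AWQ checkpoint can be loaded; the repo id matches this model, while the prompt and generation settings are purely illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "andriadze/anthracite-magnum-v3-34b-awq"

# transformers picks up the AWQ quantization config from the checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# illustrative prompt, not from the original card
inputs = tokenizer("Write a short greeting.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```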
## Model Description
The model was quantized on 6x RTX 4090 GPUs with the following quantization parameters:
"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"