qwen2-math-7b-step-dpo-Q4_K_M-GGUF / qwen2-math-7b-step-dpo-q4_k_m.gguf

Commit History

Upload qwen2-math-7b-step-dpo-q4_k_m.gguf with huggingface_hub
dc8e42e (verified) · committed by kawchar85
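
The commit message above refers to an upload made with the huggingface_hub Python library. Below is a minimal sketch of the corresponding calls, assuming the repository path is kawchar85/qwen2-math-7b-step-dpo-Q4_K_M-GGUF (the namespace is inferred from the page title and committer name, not confirmed).

```python
# Sketch of the huggingface_hub calls implied by this commit.
# NOTE: repo_id is an assumption; adjust it to the actual repository path.
from huggingface_hub import HfApi, hf_hub_download

repo_id = "kawchar85/qwen2-math-7b-step-dpo-Q4_K_M-GGUF"  # assumed repo path
filename = "qwen2-math-7b-step-dpo-q4_k_m.gguf"

# Uploading the quantized file (roughly what the commit message describes):
HfApi().upload_file(
    path_or_fileobj=filename,   # local GGUF file
    path_in_repo=filename,      # target filename in the repo
    repo_id=repo_id,
)

# Downloading the same file into the local Hugging Face cache:
local_path = hf_hub_download(repo_id=repo_id, filename=filename)
print(local_path)  # path to the cached GGUF file on disk
```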