---
license: other
license_name: tongyi-qianwen
base_model: Qwen/Qwen2-72B
datasets:
- cognitivecomputations/Dolphin-2.9
- teknium/OpenHermes-2.5
- m-a-p/CodeFeedback-Filtered-Instruction
- cognitivecomputations/dolphin-coder
- cognitivecomputations/samantha-data
- microsoft/orca-math-word-problems-200k
- Locutusque/function-calling-chatml
- internlm/Agent-FLAN
library_name: transformers
tags:
- mlx
- axolotl
pipeline_tag: image-text-to-text
---
# mlx-community/dolphin-vision-72b-4bit
This model was converted to MLX format from [`cognitivecomputations/dolphin-vision-72b`](https://huggingface.co/cognitivecomputations/dolphin-vision-72b) using mlx-vlm version **0.0.11**.
Refer to the [original model card](https://huggingface.co/cognitivecomputations/dolphin-vision-72b) for more details on the model.
## Use with mlx
```bash
pip install -U mlx-vlm
```
```bash
python -m mlx_vlm.generate --model mlx-community/dolphin-vision-72b-4bit --max-tokens 100 --temp 0.0
```
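Because this is a vision-language model, you will typically want to supply an image and a prompt. The snippet below is a minimal sketch of the mlx-vlm Python API; the image path and prompt are placeholders, and the exact `generate()` signature may differ between mlx-vlm releases, so check the version you have installed.

```python
from mlx_vlm import load, generate

# Load the 4-bit quantized weights and the matching processor from the Hub.
model, processor = load("mlx-community/dolphin-vision-72b-4bit")

# "example.jpg" and the prompt are placeholders, not part of the model card.
# The positional (model, processor, image, prompt) form follows the 0.0.x API;
# newer mlx-vlm releases may expect a different call signature.
output = generate(model, processor, "example.jpg", "Describe this image.",
                  max_tokens=100, temp=0.0, verbose=True)
print(output)
```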