---
license: other
tags:
- mlx
license_name: deepseek
license_link: LICENSE
pipeline_tag: image-text-to-text
---

# mlx-community/deepseek-vl-7b-chat-4bit
This model was converted to MLX format from [`deepseek-ai/deepseek-vl-7b-chat`](https://huggingface.co/deepseek-ai/deepseek-vl-7b-chat) using mlx-vlm version **0.0.8**.
Refer to the [original model card](https://huggingface.co/deepseek-ai/deepseek-vl-7b-chat) for more details on the model.
## Use with mlx

```bash
pip install -U mlx-vlm
```

```bash
python -m mlx_vlm.generate --model mlx-community/deepseek-vl-7b-chat-4bit --max-tokens 100 --temp 0.0
```