No module named 'mlx_lm.models.mllama'

#2
by HomeroKzam - opened

I'm trying to run a simple example with "mlx-community/Llama-3.2-11B-Vision-Instruct-8bit", and it returns the following error: No module named 'mlx_lm.models.mllama'. With the "mlx-community/Phi-3.5-vision-instruct-bf16" model, the error is: No module named 'mlx_lm.models.phi3_v'.

Command:
pip show mlx

Name: mlx
Version: 0.21.1
Summary: A framework for machine learning on Apple silicon.
Home-page: https://github.com/ml-explore/mlx
Author: MLX Contributors
Author-email: mlx@group.apple.com
License:
Location: /opt/homebrew/Caskroom/miniconda/base/envs/projects-env/lib/python3.12/site-packages
Requires:
Required-by: llama-index-llms-mlx, mlx-lm, mlx-vlm

pip show mlx_lm

Name: mlx-lm
Version: 0.20.1
Summary: LLMs on Apple silicon with MLX and the Hugging Face Hub
Home-page: https://github.com/ml-explore/mlx-examples
Author: MLX Contributors
Author-email: mlx@group.apple.com
License: MIT
Location: /opt/homebrew/Caskroom/miniconda/base/envs/projects-env/lib/python3.12/site-packages
Requires: jinja2, mlx, numpy, protobuf, pyyaml, transformers
Required-by: llama-index-llms-mlx

MLX Community org

It's because this is a vision model, so you have to use the mlx-vlm package rather than mlx-lm.
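A minimal sketch of what that switch looks like, assuming the package is published on PyPI as mlx-vlm and that its CLI flags (--model, --image, --prompt) match the current release; check the mlx-vlm README for your installed version:

```shell
# Install the vision-language package (separate from mlx-lm)
pip install mlx-vlm

# Run a vision model through the mlx-vlm CLI instead of mlx-lm
python -m mlx_vlm.generate \
  --model mlx-community/Llama-3.2-11B-Vision-Instruct-8bit \
  --image path/to/image.jpg \
  --prompt "Describe this image."
```

The mllama and phi3_v architectures are implemented in mlx-vlm, which is why mlx-lm cannot find those modules.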
