# whisper-medium-mlx-8bit

This model was converted to MLX format from OpenAI's Whisper `medium` model, with the weights quantized to 8 bits.

## Use with mlx

```bash
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples/whisper/
pip install -r requirements.txt
```

```python
import whisper

whisper.transcribe("FILE_NAME")
```
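The call above loads whatever checkpoint the `whisper` module in mlx-examples uses by default. To run this 8-bit medium conversion specifically, the transcription call can be pointed at this Hugging Face repo. A minimal sketch, assuming the `path_or_hf_repo` keyword argument from mlx-examples/whisper and a local audio file named `speech.wav` (both are assumptions, not part of the original card):

```python
import whisper

# Transcribe a local audio file with this 8-bit medium conversion.
# path_or_hf_repo is assumed to accept a Hugging Face repo id,
# as in the mlx-examples whisper implementation.
result = whisper.transcribe(
    "speech.wav",  # hypothetical input file
    path_or_hf_repo="mlx-community/whisper-medium-mlx-8bit",
)

# The result is a dict; "text" holds the full transcription.
print(result["text"])
```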