---
datasets:
- wikipedia
- mc4
- cc100
- oscar-corpus/OSCAR-2301
- oscar-corpus/OSCAR-2201
- cerebras/SlimPajama-627B
language:
- ja
license: apache-2.0
pipeline_tag: text-generation
tags:
- japanese-stablelm
- causal-lm
- mlx
extra_gated_fields:
  Name: text
  Email: text
  Country: text
  Organization or Affiliation: text
  I allow Stability AI to contact me about information related to its models and research: checkbox
---

# mlx-community/japanese-stablelm-base-gamma-7b-4bit

The model [mlx-community/japanese-stablelm-base-gamma-7b-4bit](https://huggingface.co/mlx-community/japanese-stablelm-base-gamma-7b-4bit) was converted to MLX format from [stabilityai/japanese-stablelm-base-gamma-7b](https://huggingface.co/stabilityai/japanese-stablelm-base-gamma-7b) using mlx-lm version **0.18.0**.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/japanese-stablelm-base-gamma-7b-4bit")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
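As an alternative to the Python API, mlx-lm also provides a command-line generator. A minimal sketch; the Japanese prompt and the `--max-tokens` value here are illustrative, and the model weights are downloaded from the Hub on first run:

```shell
# Generate a completion from the command line (downloads the model on first use).
python -m mlx_lm.generate \
  --model mlx-community/japanese-stablelm-base-gamma-7b-4bit \
  --prompt "日本の首都は" \
  --max-tokens 100
```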