---
language:
- en
library_name: transformers
license: apache-2.0
tags:
- gpt
- llm
- large language model
- h2o-llmstudio
thumbnail: >-
  https://h2o.ai/etc.clientlibs/h2o/clientlibs/clientlib-site/resources/images/favicon.ico
pipeline_tag: text-generation
---
# mlx-community/Hermes-2-Theta-Llama-3-8B-4bit
The model was converted to MLX format from [NousResearch/Hermes-2-Theta-Llama-3-8B](https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B) using mlx-lm version 0.14.3.

Converted & uploaded by [@ucheog](https://huggingface.co/ucheog) (Uche Ogbuji).

Refer to the [original model card](https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B) for more details on the model.
## Use with mlx

```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

model, tokenizer = load('mlx-community/Hermes-2-Theta-Llama-3-8B-4bit')
response = generate(model, tokenizer, prompt='Hello! Tell me something good.', verbose=True)
```
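For chat-style use, the original model card notes that Hermes 2 Θ uses the ChatML prompt format. A minimal sketch of what such a prompt looks like is below; the `chatml_prompt` helper is hypothetical, written only to illustrate the format — in practice the tokenizer's `apply_chat_template` method builds this for you.

```python
# Illustrative only: build a ChatML-formatted prompt by hand.
# The chatml_prompt helper is not part of mlx-lm; with mlx-lm you
# would normally use tokenizer.apply_chat_template(messages, ...).
def chatml_prompt(messages, add_generation_prompt=True):
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>" for m in messages]
    if add_generation_prompt:
        # Cue the model to answer as the assistant.
        parts.append("<|im_start|>assistant")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello! Tell me something good."},
]
print(chatml_prompt(messages))
```

The resulting string can then be passed as the `prompt` argument to `generate`.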
## Conversion command

```bash
python -m mlx_lm.convert --hf-path NousResearch/Hermes-2-Theta-Llama-3-8B --mlx-path ~/.local/share/models/mlx/Hermes-2-Theta-Llama-3-8B -q
```