A converted version of Sosaka/Alpaca-native-4bit-ggml with the ggjt magic, for use with llama.cpp or pyllamacpp. Full credit goes to Sosaka.

Usage via pyllamacpp

Installation: pip install pyllamacpp

Download and inference:

from huggingface_hub import hf_hub_download
from pyllamacpp.model import Model

# Download the model
hf_hub_download(repo_id="LLukas22/alpaca-native-7B-4bit-ggjt", filename="ggjt-model.bin", local_dir=".")

# Load the model
model = Model(ggml_model="ggjt-model.bin", n_ctx=2000)

# Generate
prompt = "The meaning of life is"
result = model.generate(prompt, n_predict=50)
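Since this is an Alpaca fine-tune, completions are usually better when the input is wrapped in the Stanford Alpaca instruction template rather than passed as a bare string. The sketch below builds such a prompt; the exact template this conversion was trained on is an assumption, and the commented `model.generate` call reuses the API shown above.

```python
# Standard Stanford Alpaca instruction template (assumed to match this
# conversion's fine-tuning format).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

# Wrap a plain instruction in the template before generating.
prompt = ALPACA_TEMPLATE.format(instruction="Explain the meaning of life in one sentence.")
print(prompt)

# Then generate exactly as in the basic example above:
# result = model.generate(prompt, n_predict=50)
```

The model should stop naturally after the response section; if it runs on, raising or lowering `n_predict` controls the completion length.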