🇫🇷 Calme-7B: Calme fine-tuned models collection (20 items)
Calme-12B is a state-of-the-art language model with 12 billion parameters, created by merging and then fine-tuning on high-quality datasets on top of Calme-7B-Instruct-v0.9. Like the rest of the Calme-7B family, it excels at generating clear, calm, and coherent text.
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="MaziyarPanahi/Calme-12B-Instruct-v0.1")
# Example call with an illustrative prompt; the pipeline returns a list of generated sequences
print(pipe("Write a short, calm note about learning something new.", max_new_tokens=128)[0]["generated_text"])

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("MaziyarPanahi/Calme-12B-Instruct-v0.1")
model = AutoModelForCausalLM.from_pretrained("MaziyarPanahi/Calme-12B-Instruct-v0.1")
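The snippet above only loads the weights and tokenizer. The lines below are a minimal generation sketch that reuses them, assuming the tokenizer ships a chat template; the prompt and sampling settings are illustrative, not values published for this model.

# Minimal generation sketch (illustrative prompt and sampling settings)
messages = [{"role": "user", "content": "Explain, in one calm paragraph, what makes a good summary."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95)
# Decode only the newly generated tokens
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))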
I love how GGUF democratizes the use of Large Language Models (LLMs) on commodity hardware, specifically personal computers without any dedicated accelerator. Because of this, I am committed to converting and quantizing every model I fine-tune so that it is accessible to everyone!
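For reference, a GGUF quant can be run locally with llama-cpp-python along these lines. This is only a sketch: the repository id and filename pattern below are assumptions for illustration, so check the actual GGUF repository for the available quant files.

# Sketch of running a GGUF quant with llama-cpp-python (repo id and filename are assumptions)
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="MaziyarPanahi/Calme-12B-Instruct-v0.1-GGUF",  # assumed GGUF repo name
    filename="*Q4_K_M.gguf",  # glob for a 4-bit quant; any available quant file works
    n_ctx=4096,
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write one calm, clear sentence about reading."}]
)
print(out["choices"][0]["message"]["content"])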