MERAK

HAPPY TO ANNOUNCE THE RELEASE OF MERAK-7B-V4_4bit_q128_awq!

Merak-7B is a Large Language Model for the Indonesian language.

This model is based on Mistral-7B-OpenOrca and fine-tuned on a set of Indonesian Wikipedia articles that I cleaned beforehand.

Leveraging QLoRA (QLoRA: Efficient Finetuning of Quantized LLMs), Merak-7B is able to run with 16 GB of VRAM.
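To see why a 7B-parameter model fits comfortably in 16 GB of VRAM once quantized, here is a rough back-of-the-envelope estimate (illustrative arithmetic only; real usage also includes activations, KV cache, and overhead):

```python
# Rough weight-memory estimate for a 4-bit quantized 7B model.
# These are illustrative round numbers, not measured values.
params = 7e9            # ~7 billion parameters
bits_per_weight = 4     # 4-bit AWQ quantization
weight_gb = params * bits_per_weight / 8 / 1e9

print(f"~{weight_gb:.1f} GB for weights alone")  # ~3.5 GB
```

The weights alone drop to roughly 3.5 GB, leaving ample headroom within a 16 GB budget for activations and the KV cache during inference.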

Licensed under Creative Commons Attribution-NonCommercial-ShareAlike (CC BY-NC-SA 4.0), Merak-7B empowers AI enthusiasts and researchers alike.

Big thanks to all my friends and the communities that helped build our first model. Thanks also to Axolotl, a great tool designed to streamline the fine-tuning of various AI models.

Feel free to ask me about the model, and please share the news on your social media.
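A minimal sketch of loading this AWQ-quantized checkpoint with the Hugging Face `transformers` library (assumes `transformers`, `torch`, and the `autoawq` package are installed; the model ID is taken from this card, and the example prompt is my own):

```python
# Minimal sketch: load the 4-bit AWQ checkpoint and generate a reply.
# Requires a CUDA GPU plus the transformers + autoawq packages.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Ichsan2895/Merak-7B-v4_4bit_q128_awq"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Example prompt in Indonesian: "Who was the first president of Indonesia?"
prompt = "Siapa presiden pertama Indonesia?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

`device_map="auto"` places the quantized weights on the available GPU; for chat-style use, wrapping the prompt in the base model's chat template via `tokenizer.apply_chat_template` is usually preferable.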

Downloads last month: 62
Safetensors model size: 1.2B params (tensor types: FP16, I32)
