
KAI-7B


KAI-7B Large Language Model (LLM) is a fine-tuned generative text model based on Mistral 7B. With over 7 billion parameters, KAI-7B outperforms its closest competitor, Meta's Llama 2 70B, in all benchmarks we tested.

[Benchmark charts]

As the benchmarks above show, KAI-7B excels in STEM subjects but still needs work in math and coding.
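
Usage

The snippet below is a minimal, unofficial sketch of how the model can be loaded with the Hugging Face Transformers library. The repository ID is taken from this page; the prompt and generation settings are illustrative assumptions, not official recommendations.

```python
# Minimal sketch: loading KAI-7B with Hugging Face Transformers.
# Repo ID is from the model page; generation settings are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Keynote-Technology/KAI-7B-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are published in BF16
    device_map="auto",           # requires the `accelerate` package
)

# KAI-7B is a base model, so prompt it with plain text rather than a chat template.
prompt = "Photosynthesis is the process by which"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```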

Notice

KAI-7B is a pretrained base model and therefore does not have any moderation mechanisms.

Banned Use

KAI-7B is governed by the Apache 2.0 license, so any use the license deems unacceptable is not permitted. In addition, we specifically ban the use of ANY AND ALL KAI MODELS for hate speech directed at any person, group, or thing, for legal and ethical reasons.

Model details

Format: Safetensors
Model size: 7.24B parameters
Tensor type: BF16
Inference

The model is too large to run on the serverless Inference API. To try it, deploy it on Inference Endpoints (dedicated) instead.
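
Because the checkpoint ships as BF16 safetensors, a rough estimate of the weight memory is 7.24B parameters × 2 bytes ≈ 14.5 GB, which is why a dedicated endpoint (or a local GPU with enough memory) is needed. A quick back-of-the-envelope check, assuming nothing beyond the numbers above:

```python
# Back-of-the-envelope memory estimate for the BF16 checkpoint.
# 7.24B parameters is the size reported above; BF16 uses 2 bytes per parameter.
params = 7.24e9
bytes_per_param = 2  # BF16
weight_gib = params * bytes_per_param / 1024**3
print(f"Approximate weight memory: {weight_gib:.1f} GiB")  # ~13.5 GiB, excluding activations and KV cache
```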
