---
license: apache-2.0
pipeline_tag: text-generation
language:
  - da
tags:
  - pretrained
inference:
  parameters:
    temperature: 0.7
datasets:
  - DDSC/partial-danish-gigaword-no-twitter
base_model: mistralai/Mistral-7B-v0.1
---

# Model Card for Munin 7B Alpha

The Munin 7B Alpha Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters, created by continually pretraining Mistral-7B-v0.1 on Danish Gigaword.

For full details of this model, please read our release blog post. The codebase can be found in our Git repository.

Note: this is an alpha release, and we do not recommend using this model in production. If you do use it, please let us know.
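Even as an alpha checkpoint, the model can be loaded like any other causal LM with the `transformers` library. A minimal sketch, assuming the model is published on the Hugging Face Hub under an ID like `danish-foundation-models/munin-7b-alpha` (verify the exact ID on the Hub) and that you have a GPU or ample RAM for the 7B weights:

```python
# Hypothetical usage sketch for Munin 7B Alpha; the Hub ID below is an
# assumption -- check the model repository for the exact name.
MODEL_ID = "danish-foundation-models/munin-7b-alpha"

def generate(prompt: str, max_new_tokens: int = 50) -> str:
    # Imports live inside the function so the module can be imported
    # without transformers installed; loading 7B weights needs a GPU
    # (or ~15 GB of RAM in float16).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # temperature=0.7 matches the inference default in the card metadata.
    outputs = model.generate(
        **inputs, max_new_tokens=max_new_tokens, temperature=0.7, do_sample=True
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # A Danish prompt, since the model is continually pretrained on Danish text.
    print(generate("Danmark er et land i"))
```

Because this is a base model with no instruction tuning, prompts should be framed as text to continue rather than as questions or chat turns.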

## Notice

Munin 7B Alpha is, like Mistral 7B, a pretrained base model and therefore does not have any moderation mechanisms.

## Development

The model is developed by the Danish Foundation Models team.

### With Support From