---
license: apache-2.0
pipeline_tag: text-generation
language:
- da
tags:
- pretrained
inference:
  parameters:
    temperature: 0.7
datasets:
- DDSC/partial-danish-gigaword-no-twitter
base_model: mistralai/Mistral-7B-v0.1
---

# Model Card for Munin 7B Alpha

The Munin 7B Alpha Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters, based on [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1).

It has been trained on [Danish Gigaword](https://gigaword.dk/) using [continual pretraining](https://doi.org/10.48550/arXiv.2308.04014).

For full details of this model, please read our [release blog post](https://foundationmodels.dk/blog/2024/01/11/releasing-munin-7b-alpha---a-danish-llm/).
The codebase can be found in [our Git repository](https://github.com/centre-for-humanities-computing/danish-foundation-models).
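
Since this is a standard causal language model, it can be loaded and sampled with the Hugging Face `transformers` library. Below is a minimal sketch, not official usage instructions: the Hub repository id `danish-foundation-models/munin-7b-alpha` is an assumption based on the model name (verify it on the Hub), and the sampling temperature reuses the `temperature: 0.7` inference parameter from the metadata above.

```python
# Minimal usage sketch (assumptions noted below). Requires `torch`,
# `transformers`, and `accelerate` (for device_map="auto").
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "danish-foundation-models/munin-7b-alpha"  # assumed Hub repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # load weights in the checkpoint's native precision
    device_map="auto",    # place layers on available GPU(s)/CPU
)

# A Danish prompt; as a base model, Munin continues text rather than
# following instructions.
prompt = "Danmark er et land i"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    temperature=0.7,  # matches the inference parameter in the metadata above
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the model is a pretrained base model, write prompts as text to be continued rather than as instructions or chat turns.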

**Note:** This is an alpha model, and we do not recommend using it in production. If you do use the model, please let us know.

## Notice

Munin 7B Alpha is, like Mistral 7B, a pretrained base model and therefore does not have any moderation mechanisms.


## Development

The model is developed by the [Danish Foundation Models Team](https://foundationmodels.dk).

## With Support From

- [Danish e-infrastructure Consortium](https://www.deic.dk/)
- [Acquisition and Logistics Organisation at the Danish Ministry of Defence](https://www.fmi.dk/)
- Danish Ministry of Higher Education and Science under [the Digital Security, Trust
  and Data Ethics performance contract](https://bedreinnovation.dk/)