---
language:
- da
- sv
license: cc-by-4.0
library_name: transformers
tags:
- merge
- mergekit
base_model:
- danish-foundation-models/munin-7b-alpha
- timpal0l/Mistral-7B-v0.1-flashback-v2
---
# Danish-Swedish Merged Model
This is a merge of the following models, both based on `mistralai/Mistral-7B-v0.1`:
1. `danish-foundation-models/munin-7b-alpha`, the result of continued pretraining on Danish data;
2. `timpal0l/Mistral-7B-v0.1-flashback-v2`, the result of continued pretraining on Swedish data.
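Since the result is a plain decoder-only base model, it can be loaded like any other Mistral-7B checkpoint. A minimal sketch using the standard `transformers` API, where `your-org/danish-swedish-merged-7b` is a hypothetical placeholder for this repository's model id:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical placeholder - replace with this repository's actual model id
model_id = "your-org/danish-swedish-merged-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge was performed in bfloat16
    device_map="auto",
)

# This is a base model, so use plain text continuation rather than a chat template
inputs = tokenizer("København er", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```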
## Model Details
- **Merged by:** [Dan Saattrup Nielsen](https://www.saattrupdan.com/)
- **Model type:** Decoder model, based on `mistralai/Mistral-7B-v0.1`
- **Language(s):** Danish and Swedish
- **License:** [CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/)
- **Merge configuration:**
```python
dict(
models=[
dict(
model="danish-foundation-models/munin-7b-alpha",
parameters=dict(
weight=1.0,
density=0.3,
),
),
dict(
model="timpal0l/Mistral-7B-v0.1-flashback-v2",
parameters=dict(
weight=1.0,
density=0.3,
),
),
],
merge_method="dare_ties",
random_seed=4242,
base_model="mistralai/Mistral-7B-v0.1",
parameters=dict(
int8_mask=True,
normalize=True,
),
dtype="bfloat16",
)
```
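The merge can be reproduced with [mergekit](https://github.com/arcee-ai/mergekit). A minimal sketch using mergekit's Python API, passing the configuration above to `MergeConfiguration.model_validate`; the output directory and options are illustrative, so verify field names against the mergekit version you have installed:

```python
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Validate the merge configuration shown above
merge_config = MergeConfiguration.model_validate(
    dict(
        models=[
            dict(
                model="danish-foundation-models/munin-7b-alpha",
                parameters=dict(weight=1.0, density=0.3),
            ),
            dict(
                model="timpal0l/Mistral-7B-v0.1-flashback-v2",
                parameters=dict(weight=1.0, density=0.3),
            ),
        ],
        merge_method="dare_ties",
        random_seed=4242,
        base_model="mistralai/Mistral-7B-v0.1",
        parameters=dict(int8_mask=True, normalize=True),
        dtype="bfloat16",
    )
)

# Run the merge and write the merged weights to disk
run_merge(
    merge_config,
    "./merged-model",  # illustrative output directory
    options=MergeOptions(cuda=True),
)
```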