---
license: mit
pipeline_tag: text-generation
language:
- sv
- en
tags:
- pretrained
widget:
- text: "Jag tycker att det är roligt med"
---
# 🐈⬛ Mistral-7B-v0.1-flashback-v2
![](https://huggingface.co/timpal0l/Mistral-7B-v0.1-flashback-v2/resolve/main/flashcat.png?download=true)
Mistral-7B-v0.1-flashback-v2 continues the pretraining of the base Mistral-7B-v0.1 model on roughly 40 GB of forum threads from the Swedish website flashback.org.
It is a full finetune, trained for one epoch.
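Since the card declares `pipeline_tag: text-generation`, the model can be loaded with the standard `transformers` causal-LM classes. The sketch below is a minimal, hedged example (the repo id is taken from this card; `device_map="auto"` assumes `accelerate` is installed, and the ~14 GB of weights are only downloaded when `generate` is actually called):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "timpal0l/Mistral-7B-v0.1-flashback-v2"


def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Generate a continuation of `prompt` with the flashback model.

    Loading is deferred to call time because the weights are large;
    for repeated calls you would cache the model and tokenizer instead.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # The widget prompt from this card ("I think it is fun with ...")
    print(generate("Jag tycker att det är roligt med"))
```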