Update README.md (add a link to Mixtral)
#1
by nss-ysasaki - opened

README.md CHANGED
@@ -10,7 +10,7 @@ license: apache-2.0
 
 # Swallow-MX-8x7b-NVE-v0.1
 
-Our Swallow-MX-8x7b-NVE-v0.1 model has undergone continuous pre-training from the Mixtral-8x7B-Instruct-v0.1, primarily with the addition of Japanese language data.
+Our Swallow-MX-8x7b-NVE-v0.1 model has undergone continuous pre-training from the [Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1), primarily with the addition of Japanese language data.
 