Update README.md
README.md CHANGED
@@ -18,7 +18,7 @@ pipeline_tag: text-generation
 
 Latxa is a collection of foundation models specifically tuned for Basque. Based on Meta’s LLaMA 2 model family, these models were further trained with Euscrawl, a highly curated Basque corpus ([Artetxe et al., 2022](https://aclanthology.org/2022.emnlp-main.499/)). Ranging from 7 billion to 70 billion parameters, these models are currently the biggest and best-performing LLMs built for Basque. This is the 7b repository; links to other models can be found in the [Latxa Collection](https://huggingface.co/collections/HiTZ/latxa-65a697e6838b3acc53677304).
 
-Read our [
+Read more about Latxa on our [website](https://www.hitz.eus/en/node/340) or on [LinkedIn](https://www.linkedin.com/pulse/presenting-latxa-largest-language-model-built-basque-hitz-zentroa-63qdf)!
 
 # **Model Details**
 
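Since the card's `pipeline_tag` is `text-generation`, the sketch below shows one way the 7b model might be loaded with the Hugging Face `transformers` library. The repository id `HiTZ/latxa-7b-v1` and the Basque prompt are assumptions for illustration; check the Latxa Collection linked above for the exact model names.

```python
from transformers import pipeline

# Minimal sketch: load a Latxa checkpoint for text generation.
# "HiTZ/latxa-7b-v1" is an assumed repository id, not confirmed by the diff.
generator = pipeline("text-generation", model="HiTZ/latxa-7b-v1")

# Illustrative Basque prompt; generate a short continuation.
output = generator("Euskara hizkuntza bat da", max_new_tokens=30)
print(output[0]["generated_text"])
```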