---
license: apache-2.0
language:
- th
library_name: transformers
pipeline_tag: text-generation
tags:
- pretrained
---
# Model Card for Typhoon-7B
**Typhoon 7B** is a pretrained Thai-language adaptation of Mistral-7B with 7 billion parameters.
**Typhoon 7B** outperforms all open-source Thai language models available at the time of its release, and its performance is on par with GPT-3.5 while being 2.62 times more efficient.
<div align="center">
<img src="https://storage.googleapis.com/scb10x-ai-lab-public/assets/typhoon_benchmark.png" alt="Typhoon benchmark" width="100%" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
</div>
For full details of this model please read our [paper]() and [release blog post]().
## Requirements
Transformers 4.34.0 or newer.
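A minimal generation sketch using the `transformers` API is shown below. The repository id `scb10x/typhoon-7b` is an assumption based on the team name; substitute the actual Hugging Face id if it differs.

```python
# Minimal text-generation sketch for Typhoon 7B.
# NOTE: the repository id below is an assumption, not confirmed by this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "scb10x/typhoon-7b"  # assumed Hugging Face repository id

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the model and continue the given prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Typhoon 7B is a base model, so it completes text rather than
    # following instructions; give it a prefix to continue.
    print(generate("ประเทศไทยมีจังหวัดทั้งหมด"))
```

Since this is a base model, expect text completion rather than instruction following.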
## Model date
**Typhoon 7B** was trained in December 2023.
## License
Apache-2.0 (Commercial)
## Notice
Typhoon 7B is a pretrained base model: it cannot follow human instructions without few-shot prompting or fine-tuning on an instruction dataset,
and it does not include any moderation mechanisms.
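As a sketch of the few-shot approach mentioned above, a prompt can be assembled from demonstration pairs placed before the actual query, so the base model continues the established pattern. The example pairs below are illustrative placeholders, not part of any dataset used by the model.

```python
# Build a simple few-shot prompt for a base (non-instruct) model.
# The example Q/A pairs are illustrative placeholders, not real data.
EXAMPLES = [
    ("Translate to Thai: hello", "สวัสดี"),
    ("Translate to Thai: thank you", "ขอบคุณ"),
]

def build_few_shot_prompt(query: str) -> str:
    """Concatenate demonstration pairs, then the new query, so the
    base model completes the answer by pattern continuation."""
    shots = "\n\n".join(f"Q: {q}\nA: {a}" for q, a in EXAMPLES)
    return f"{shots}\n\nQ: {query}\nA:"

prompt = build_few_shot_prompt("Translate to Thai: good morning")
```

The resulting string ends with an open `A:` marker, which cues the model to generate the answer for the final query.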
## SCB10X AI Team
Kunat Pipatanakul, Phatrasek Jirabovonvisut, Potsawee Manakul, Sittipong Sripaisarnmongkol, Ruangsak Patomwong, Pathomporn Chokchainant, Kasima Tharnpipitchai