Text Generation
Transformers
Safetensors
Thai
English
qwen2
text-generation-inference
sft
trl
4-bit precision
bitsandbytes
LoRA
Fine-Tuning with LoRA
LLM
GenAI
NT GenAI
ntgenai
lahnmah
NT Thai GPT
ntthaigpt
medical
medtech
HealthGPT
หลานม่า
NT Academy
conversational
Inference Endpoints
Update README.md
license: apache-2.0
new_version: Aekanun/openthaigpt-MedChatModelv5.1
---

# 🇹🇭 Model Card for `openthaigpt1.5-7b-medical-tuned`
<!-- Provide a quick summary of what the model is/does. -->
This model is fine-tuned from `openthaigpt1.5-7b-instruct` using Supervised Fine-Tuning (SFT) on the `Thaweewat/thai-med-pack` dataset. The model is designed for medical question-answering tasks in Thai, specializing in providing accurate and contextual answers based on medical information.
### 👤 **Developed and Fine-tuned by:**
**Amornpan Phornchaicharoen**, **Aekanun Thongtae**

## Model Description
This model was fine-tuned using Supervised Fine-Tuning (SFT) to optimize it for medical question answering in Thai. The base model is `openthaigpt1.5-7b-instruct`, and it has been enhanced with domain-specific knowledge using the `Thaweewat/thai-med-pack` dataset.

- **Model type:** Causal Language Model (AutoModelForCausalLM)
- **Language(s):** Thai
- **License:** Apache License 2.0
- **Fine-tuned from model:** `openthaigpt1.5-7b-instruct`
- **Dataset used for fine-tuning:** `Thaweewat/thai-med-pack`
This model can be used as a foundational model for medical assistance systems. Known limitations:

- Limited to the medical domain.
- The model is sensitive to prompts and may generate off-topic responses for non-medical queries.
## Model Training Results

![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/U0TIiWGdNaxl_9TH90gIx.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/mAZBm9Dk7-S-FQ4srj3aG.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/PgRsAWRPGw6T2tsF2aJ3W.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/lmreg4ibgBllTvzfhMeSU.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/cPJ3PWKcqwV2ynNWO1Qrs.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/mkM8VavlG9xHhgNlZ9E1X.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/MecCnAmLlYdpBjwJjMQFu.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/ijHMzw9Zrpm23o89vzsSc.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/hOIyuIA_zT7_s8SG-ZDWQ.png)
## How to Get Started with the Model

Here’s how to load and use the model for generating medical responses in Thai:
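A minimal inference sketch follows; the repository id, system prompt, and sampling settings are illustrative assumptions, and the 4-bit loading mirrors the model's `bitsandbytes` tags.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Hypothetical repo id -- substitute the model's actual Hugging Face path.
model_id = "openthaigpt1.5-7b-medical-tuned"

# Load in 4-bit to fit the 7B model on a single consumer GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)

# Chat-style prompt (the system text and question are illustrative).
messages = [
    {"role": "system", "content": "You are a Thai medical question-answering assistant."},
    {"role": "user", "content": "อาการของโรคเบาหวานมีอะไรบ้าง"},  # "What are the symptoms of diabetes?"
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
generated_text = tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
print(generated_text)
```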
### Authors

* Amornpan Phornchaicharoen (amornpan@gmail.com)
* Aekanun Thongtae (cto@bangkokfirsttech.com)