---
library_name: transformers
tags: []
---

# Model Card for `openthaigpt1.5-7b-medical-tuned`

![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/U0TIiWGdNaxl_9TH90gIx.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/mAZBm9Dk7-S-FQ4srj3aG.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/PgRsAWRPGw6T2tsF2aJ3W.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/lmreg4ibgBllTvzfhMeSU.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/cPJ3PWKcqwV2ynNWO1Qrs.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/mkM8VavlG9xHhgNlZ9E1X.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/MecCnAmLlYdpBjwJjMQFu.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/ijHMzw9Zrpm23o89vzsSc.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/663ce15f197afc063058dc3a/hOIyuIA_zT7_s8SG-ZDWQ.png)

This model is fine-tuned from `openthaigpt1.5-7b-instruct` using Supervised Fine-Tuning (SFT) on the `Thaweewat/thai-med-pack` dataset. It is designed for medical question-answering tasks in Thai and aims to provide accurate, context-aware answers grounded in medical information.

## Model Details

### Model Description

This model was fine-tuned using Supervised Fine-Tuning (SFT) to optimize it for medical question answering in Thai. The base model is `openthaigpt1.5-7b-instruct`, and it has been enhanced with domain-specific knowledge from the `Thaweewat/thai-med-pack` dataset. An illustrative SFT setup is sketched below under "Training Sketch."

- **Developed by:** Amornpan Phornchaicharoen
- **Fine-tuned by:** Amornpan Phornchaicharoen
- **Model type:** Causal Language Model (AutoModelForCausalLM)
- **Language(s):** Thai
- **License:** Amornpan Phornchaicharoen
- **Fine-tuned from model:** `openthaigpt1.5-7b-instruct`
- **Dataset used for fine-tuning:** `Thaweewat/thai-med-pack`

### Model Sources

- **Repository:** [amornpan/openthaigpt-MedChatModelv11](https://huggingface.co/amornpan/openthaigpt-MedChatModelv11)
- **Base Model:** [openthaigpt/openthaigpt1.5-7b-instruct](https://huggingface.co/openthaigpt/openthaigpt1.5-7b-instruct)
- **Dataset:** [Thaweewat/thai-med-pack](https://huggingface.co/datasets/Thaweewat/thai-med-pack)

## Uses

### Direct Use

The model can be used directly to generate medical responses in Thai. It has been optimized for:

- Medical question answering
- Providing clinical information
- Health-related dialogue generation

### Downstream Use

The model can serve as a foundation for medical assistance systems, chatbots, and other healthcare applications in the Thai language.

### Out-of-Scope Use

- This model should not be used for real-time diagnosis or emergency medical scenarios.
- Avoid using it for critical clinical decisions without human oversight; it is not intended to replace professional medical advice.

## Bias, Risks, and Limitations

### Bias

- The model may reflect biases present in the dataset, particularly when addressing underrepresented medical conditions or topics.

### Risks

- Responses may contain inaccuracies due to the inherent limitations of the model and the dataset used for fine-tuning.
- The model should not be used as the sole source of medical advice.

### Limitations

- The model is limited to the medical domain.
- It is sensitive to prompts and may generate off-topic responses for non-medical queries.
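
## Training Sketch

The exact training configuration for this fine-tune is not published in this card. As a rough illustration of the SFT recipe described above, the snippet below shows one way such a run could be set up with the TRL library. The base-model repository id, hyperparameters, and output path are assumptions made for the example, not the settings actually used for this model.

```python
# Illustrative only -- not the actual training script for this model.
# Assumes the TRL library (`pip install trl datasets`) and that the dataset can be
# fed to SFTTrainer as-is; check the Thaweewat/thai-med-pack dataset card for its
# real schema and add a formatting function if needed.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

dataset = load_dataset("Thaweewat/thai-med-pack", split="train")

training_args = SFTConfig(
    output_dir="./openthaigpt1.5-7b-medical-tuned",  # hypothetical output path
    per_device_train_batch_size=2,                   # assumed hyperparameters
    gradient_accumulation_steps=8,
    num_train_epochs=3,
    learning_rate=2e-5,
    logging_steps=10,
)

trainer = SFTTrainer(
    model="openthaigpt/openthaigpt1.5-7b-instruct",  # base model on the Hugging Face Hub
    args=training_args,
    train_dataset=dataset,
)
trainer.train()
```

In practice, sequence-length and packing settings, and possibly parameter-efficient fine-tuning (e.g., LoRA via `peft`), would be added depending on the available hardware.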
## How to Get Started with the Model

Here’s how to load and use the model to generate medical responses in Thai (a chat-template variant of this example is sketched at the end of this card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("amornpan/openthaigpt-MedChatModelv11")
model = AutoModelForCausalLM.from_pretrained("amornpan/openthaigpt-MedChatModelv11")

# Input your medical question or prompt in Thai
input_text = "ใส่คำถามทางการแพทย์ที่นี่"  # "Enter your medical question here"
inputs = tokenizer(input_text, return_tensors="pt")

# Generate a response (increase max_new_tokens for longer answers)
output = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.7)

# Decode and print the generated response, skipping special tokens
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

### Example Output

**Prompt (Thai):** โปรดอธิบายลักษณะช่องปากที่เป็นมะเร็งในระยะเริ่มต้น *("Please describe the characteristics of early-stage oral cancer.")*

**Model response (Thai):**

ช่องปากมะเร็งในระยะเริ่มต้น อาจไม่มีอาการชัดเจน แต่ผู้คนบางกลุ่มอาจสังเกตเห็นอาการต่อไปนี้:

- มีการกัดหรือกระแทกบริเวณช่องปากโดยไม่มีสาเหตุ
- มีจุด ฝี เมล็ด หรือความไม่เท่าเทียมภายในช่องปากที่ไม่หายวื้อ
- ปวดหรือเจ็บบริเวณช่องปาก
- เปลี่ยนแปลงสีของเนื้อเยื่อในช่องปาก (อาจเป็นสีขาว หรือ黑马)
- มีตุ่มที่ไม่หาย ภายในช่องปาก
- มีความลำบากในการกิน มี

### More Information

Contact: `amornpan@gmail.com`
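
## Prompting with a Chat Template

Because the base model is an instruction-tuned chat model, prompts formatted through the tokenizer's chat template may work better than raw text. The sketch below is an assumption-laden illustration: the system-prompt wording, dtype, device placement, and generation settings are placeholders, and whether this fine-tune keeps the base model's chat template has not been verified here.

```python
# Hedged sketch, not an official recipe for this checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "amornpan/openthaigpt-MedChatModelv11"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: half precision to fit a single GPU
    device_map="auto",          # requires the `accelerate` package
)

messages = [
    {"role": "system", "content": "You are a helpful Thai medical assistant."},  # placeholder system prompt
    {"role": "user", "content": "โปรดอธิบายลักษณะช่องปากที่เป็นมะเร็งในระยะเริ่มต้น"},
]

# Build model inputs from the chat template and generate a response
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)

# Decode only the newly generated tokens
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```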