GGUF · English · llama · medical · Inference Endpoints
turquise committed · Commit e0cd6f2 · verified · 1 Parent(s): 98f0210

Delete readme.md

Files changed (1)
  1. readme.md +0 -144
readme.md DELETED
@@ -1,144 +0,0 @@
# Fine-Tuning Llama-3.1 with Comprehensive Medical Q&A Dataset

This project fine-tunes the **Llama-3.1 8B Model** using the **Comprehensive Medical Q&A Dataset** to build a specialized model capable of answering medical questions.

---

## 🚀 Features

- Fine-tuned on a diverse dataset of over **43,000 medical Q&A pairs**.
- Supports **31 distinct types of medical queries**, including treatments, chronic diseases, and protocols.
- Provides answers sourced from doctors, nurses, and pharmacists.

---

## 📂 Dataset Overview

### **Comprehensive Medical Q&A Dataset**

- **Source:** [Huggingface Hub](https://huggingface.co/datasets/keivalya/MedQuad-MedicalQnADataset)
- **License:** CC0 1.0 Universal (Public Domain Dedication)

#### **Key Details**
- **Total Questions:** 43,000+
- **Categories:** 31 medical question types (`qtype`)
- **Columns:**
  - `qtype`: Type of medical question (e.g., Treatment, Symptoms).
  - `Question`: Patient's medical question.
  - `Answer`: Expert response (from doctors, nurses, and pharmacists).

### **How the Dataset is Used**
- **Filtering:** Questions are filtered by `qtype` for domain-specific fine-tuning (see the sketch after this list).
- **Analysis:** Queries are analyzed to understand patterns, such as correlations between treatments and chronic conditions.
- **Applications:** Insights can be applied to build medical educational tools, predictive models, and virtual assistants.

For more details, check the [dataset documentation](https://huggingface.co/datasets/keivalya/MedQuad-MedicalQnADataset).
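
The filtering step can be reproduced with the Hugging Face `datasets` library. The snippet below is a minimal sketch and not part of the original project code; the split name and the exact casing of the `qtype` labels are assumptions, so inspect the printed label list before filtering.

```python
# Sketch: load MedQuad and filter by question type (assumes `pip install datasets`).
from datasets import load_dataset

ds = load_dataset("keivalya/MedQuad-MedicalQnADataset", split="train")  # split name assumed

# List the question types present in the data (the card reports 31 of them).
print(sorted(set(ds["qtype"])))

# Keep only treatment-related questions for domain-specific fine-tuning.
# NOTE: the label string "treatment" is illustrative; match it to the printed list above.
treatment_qa = ds.filter(lambda row: row["qtype"] == "treatment")
print(len(treatment_qa), treatment_qa[0]["Question"])
```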

---

## 💻 How to Use This Model

The fine-tuned model is available on Hugging Face under the repository: [`turquise/MedQA_q8`](https://huggingface.co/turquise/MedQA_q8). Below are several ways to use the model:

### **Using llama-cpp-python Library**
```python
from llama_cpp import Llama

# Load the model
llm = Llama.from_pretrained(
    repo_id="turquise/MedQA_q8",
    filename="medQA.Q8_0.gguf",
)

# Query the model
output = llm(
    "What is Medullary Sponge Kidney?",
    max_tokens=512,
    echo=True
)
print(output)
```
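
If the GGUF file includes a chat template, llama-cpp-python also exposes a chat-style API on the same `llm` object. The snippet below is a sketch: whether this particular GGUF ships a usable chat template is an assumption, so fall back to the plain completion call above if it does not.

```python
# Chat-style sketch, reusing the `llm` object loaded above.
# Assumption: the GGUF embeds a chat template; otherwise use the plain call shown earlier.
response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You answer general medical questions."},
        {"role": "user", "content": "What is Medullary Sponge Kidney?"},
    ],
    max_tokens=512,
)
print(response["choices"][0]["message"]["content"])
```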

### **Using llama.cpp**
#### **Install via Homebrew**
```bash
brew install llama.cpp

llama-cli \
  --hf-repo "turquise/MedQA_q8" \
  --hf-file medQA.Q8_0.gguf \
  -p "What is Medullary Sponge Kidney?"
```
#### **Use Pre-Built Binary**
```bash
# Download pre-built binary from:
# https://github.com/ggerganov/llama.cpp/releases

./llama-cli \
  --hf-repo "turquise/MedQA_q8" \
  --hf-file medQA.Q8_0.gguf \
  -p "What is Medullary Sponge Kidney?"
```
#### **Build from Source Code**
```bash
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
cmake -B build -DLLAMA_CURL=ON
cmake --build build -j --target llama-cli

./build/bin/llama-cli \
  --hf-repo "turquise/MedQA_q8" \
  --hf-file medQA.Q8_0.gguf \
  -p "What is Medullary Sponge Kidney?"
```
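
llama.cpp can also serve the model over an OpenAI-compatible HTTP endpoint via `llama-server`. This is a sketch under the assumption that your build is recent enough to ship the `llama-server` target and the same `--hf-repo`/`--hf-file` download flags used above:

```bash
# Sketch: build and run the HTTP server (target and flags assumed per recent llama.cpp).
cmake --build build -j --target llama-server

./build/bin/llama-server \
  --hf-repo "turquise/MedQA_q8" \
  --hf-file medQA.Q8_0.gguf \
  --port 8080
# Then query http://localhost:8080/v1/chat/completions with any OpenAI-compatible client.
```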

---

## 🤖 Example Usages

This model can assist with the following tasks:

- Answering medical questions:

  ```python
  question = "What are the symptoms of diabetes?"
  output = llm(question, max_tokens=512)
  print(output)
  ```

- Providing insights for healthcare education, e.g. answering queries about diseases, treatments, and chronic conditions.
- Supporting virtual assistants by handling frequently asked healthcare-related questions (a rough sketch of such a loop follows this list).
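
To make the virtual-assistant idea concrete, here is a minimal question-answering loop around the `llm` object from the llama-cpp-python example. It is an illustrative sketch only (the token limit and prompt handling are arbitrary choices), and every answer should still carry the disclaimer below.

```python
# Minimal FAQ-style loop around the previously loaded `llm` object.
# An empty line exits; answers are capped at 256 tokens (arbitrary choice).
while True:
    question = input("Ask a medical question (blank line to quit): ").strip()
    if not question:
        break
    result = llm(question, max_tokens=256)
    print(result["choices"][0]["text"].strip())
    print("Note: this is not medical advice; consult a licensed professional.")
```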

---

## ⚠️ Disclaimer

- This model **does not provide medical advice** and should not replace professional medical consultation.
- For any health-related questions or concerns, please consult a doctor or a licensed healthcare professional.

---

## 🤖 Applications

This fine-tuned model can be used to:
- Build **virtual assistants** and chatbots for healthcare-related queries.
- Assist healthcare professionals by handling routine inquiries.
- Enhance **medical education platforms** with AI-powered insights.

---

## 📜 Acknowledgements

- Dataset: [Huggingface Hub - MedQuad](https://huggingface.co/datasets/keivalya/MedQuad-MedicalQnADataset).
- Fine-tuning framework: [Unsloth](https://github.com/unslothai/unsloth).

If you use this project or dataset in your research, please credit the original authors.

---

## 📝 License

This project is open-sourced under the **CC0 1.0 Universal License**. See the dataset [license details](https://creativecommons.org/publicdomain/zero/1.0/).

---

## 📧 Contact

For questions or collaboration, reach out via [HF Model Community](https://huggingface.co/turquise/MedQA_q8/discussions).