---
license: apache-2.0
base_model:
- mistralai/Mistral-7B-Instruct-v0.2
pipeline_tag: question-answering
library_name: peft
tags:
- medical
- lifescience
- drugdiscovery
---
# ClinicalGPT-Pubmed-Instruct-V1.0

## Overview
ClinicalGPT-Pubmed-Instruct-V1.0 is a specialized language model fine-tuned from the mistralai/Mistral-7B-Instruct-v0.2 base model. Trained primarily on 10 million PubMed abstracts and titles, it generates answers to life science and medical questions together with relevant citations from scientific sources.

## Key Features
- Built on the Mistral-7B-Instruct-v0.2 base model
- Primary training on 10M PubMed abstracts and titles
- Generates answers with scientific citations from multiple sources
- Specialized for medical and life science domains

## Applications
- **Life Science Research**: Generate accurate, referenced answers for biomedical and healthcare queries
- **Pharmaceutical Industry**: Support healthcare professionals with evidence-based responses
- **Medical Education**: Aid students and educators with scientifically supported content from academic sources

## System Requirements

### GPU Requirements
- **Minimum VRAM**: 16-18 GB for inference in BF16 (BFloat16) precision (see the loading sketch below)
- **Recommended GPUs**:
  - NVIDIA A100 (40 GB or 80 GB) - Ideal for BF16 precision
  - Any GPU with 16+ GB VRAM
  - Performance may vary based on available memory

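As a rough illustration of the figures above, loading the weights in BF16 roughly halves memory use compared with FP32. The sketch below is illustrative rather than an official recipe; it assumes the `accelerate` package is installed (required for `device_map="auto"`) and reuses the repository id from the usage example further down.

```python
# Minimal BF16 loading sketch (assumes `accelerate` is installed for device_map="auto").
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_dir = "rohitanurag/ClinicalGPT-Pubmed-Instruct-V1.0"

tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForCausalLM.from_pretrained(
    model_dir,
    torch_dtype=torch.bfloat16,  # halves memory vs. FP32; roughly 14 GB of weights for a 7B model
    device_map="auto",           # places the weights on the available GPU(s)
)
```
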
### Software Prerequisites
- Python 3.x
- PyTorch
- Transformers library

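A quick way to confirm the prerequisites above are in place; the card does not pin exact versions, so treat the printed values as informational:

```python
# Environment check: reports library versions and available GPU memory.
import torch
import transformers

print(f"PyTorch: {torch.__version__}")
print(f"Transformers: {transformers.__version__}")
print(f"CUDA available: {torch.cuda.is_available()}")
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, {props.total_memory / 1024**3:.1f} GB VRAM")
```
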
### Basic Implementation
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Set parameters
model_dir = "rohitanurag/ClinicalGPT-Pubmed-Instruct-V1.0"
max_new_tokens = 1500
device = "cuda" if torch.cuda.is_available() else "cpu"

# Load tokenizer and model
tokenizer = AutoTokenizer.from_pretrained(model_dir)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # Mistral tokenizers ship without a pad token
model = AutoModelForCausalLM.from_pretrained(model_dir).to(device)

# Define your question
question = "What is the role of the tumor microenvironment in cancer progression?"
prompt = f"""Please provide the answer to the question asked.
### Question: {question}
### Answer: """

# Tokenize input
inputs = tokenizer(prompt, return_tensors="pt", padding=True, truncation=True).to(device)

# Generate output
output_ids = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,
    max_new_tokens=max_new_tokens,
    repetition_penalty=1.2,
    pad_token_id=tokenizer.eos_token_id,
)

# Decode and print
generated_text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(f"Generated Answer:\n{generated_text}")
```

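The front matter lists `library_name: peft` with `mistralai/Mistral-7B-Instruct-v0.2` as the base model, which suggests the repository may ship a PEFT (e.g., LoRA) adapter rather than fully merged weights. If that is the case, the adapter can also be attached explicitly with the `peft` library; the sketch below is an assumption-based alternative to the direct `AutoModelForCausalLM` load shown above.

```python
# Hypothetical PEFT-adapter load; only relevant if the repo hosts an adapter
# (as the `library_name: peft` metadata suggests) rather than merged weights.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "rohitanurag/ClinicalGPT-Pubmed-Instruct-V1.0"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attaches the fine-tuned adapter weights
```
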
## Sample Output
```
### Question: What is the role of the tumor microenvironment in cancer progression, and how does it influence the response to therapy?
### Answer:
The tumor microenvironment (TME) refers to the complex network of cells, extracellular matrix components, signaling molecules, and immune cells that surround a growing tumor. It plays an essential role in regulating various aspects of cancer development and progression...

### References:
1. Hanahan D, Weinberg RA. Hallmarks of Cancer: The Next Generation. Cell. 2011;144(5):646-74. doi:10.1016/j.cell.2011.03.019
2. Coussens LM, Pollard JW. Angiogenesis and Metastasis. Nature Reviews Cancer. 2006;6(1):57-68. doi:10.1038/nrc2210
3. Mantovani A, et al. Cancer's Educated Environment: How the Tumour Microenvironment Promotes Progression. Cell. 2017;168(6):988-1001.e15. doi:10.1016/j.cell.2017.02.011
4. Cheng YH, et al. Targeting the Tumor Microenvironment for Improved Therapy Response. Journal of Clinical Oncology. 2018;34(18_suppl):LBA10001. doi:10.1200/JCO.2018.34.18_suppl.LBA10001
5. Kang YS, et al. Role of the Tumor Microenvironment in Cancer Immunotherapy. Current Opinion in Pharmacology. 2018;30:101-108. doi:10.1016/j.ycoop.20
```

## Model Details
- **Base Model**: Mistral-7B-Instruct-v0.2
- **Primary Training Data**: 10 million PubMed abstracts and titles
- **Specialization**: Medical question-answering with scientific citations
- **Output**: Generates detailed answers with relevant academic references

## Future Development
ClinicalGPT-Pubmed-Instruct-V2.0 is under development, featuring:
- Training on an expanded set of 20 million PubMed articles
- Inclusion of full-text articles from various academic sources
- Enhanced performance for life science tasks
- Expanded citation capabilities across multiple scientific databases

## Contributors
- Rohit Anurag – Principal Data Scientist
- Aneesh Paul – Data Scientist

## License
Licensed under the Apache License, Version 2.0. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0