
🔎Taiwan-inquiry_7B_v2.0.gguf

| Name | Quant method | Bits | Size | Use case |
| --- | --- | --- | --- | --- |
| Taiwan-inquiry_7B_v2.0-Q4_K_M.gguf | Q4_K_M | 4 | 4.54 GB | medium, balanced quality - recommended |
| Taiwan-inquiry_7B_v2.0-Q5_K_M.gguf | Q5_K_M | 5 | 5.32 GB | large, very low quality loss - recommended |
| Taiwan-inquiry_7B_v2.0-Q6_K.gguf | Q6_K | 6 | 6.14 GB | very large, extremely low quality loss |
| Taiwan-inquiry_7B_v2.0-Q8_0.gguf | Q8_0 | 8 | 7.96 GB | very large, extremely low quality loss - not recommended |
| Taiwan-inquiry_7B_v2.0.gguf | No quantization | 16 or 32 | 15 GB | very large, no quality loss - not recommended |

Usage of the model

  • You can take on the role of a doctor, and the model will converse with you as if it were a patient.
  • You can provide the model with a brief patient background in the system prompt, and it will respond based on that background. (Backgrounds can be generated with my patient generator: colab.)
  • You can ask directly about a certain disease's symptoms and possible therapies. (Warning: this is not medical advice!)
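The doctor/patient workflow above can be sketched with llama-cpp-python (an assumed dependency, installed via `pip install llama-cpp-python`; the quant filename, context size, and sampling settings are illustrative, not part of this card):

```python
def build_messages(background: str, doctor_turn: str) -> list:
    """Compose a chat history: the patient background goes in the
    system prompt, the doctor's utterance is the user turn."""
    return [
        {"role": "system", "content": background},
        {"role": "user", "content": doctor_turn},
    ]


def chat(model_path: str, background: str, doctor_turn: str) -> str:
    """Run one round of the role-play. Assumes llama-cpp-python is
    installed and `model_path` points at a downloaded GGUF file."""
    from llama_cpp import Llama

    llm = Llama(model_path=model_path, n_ctx=4096)
    out = llm.create_chat_completion(
        messages=build_messages(background, doctor_turn),
        max_tokens=256,
    )
    return out["choices"][0]["message"]["content"]


# Example call (requires the model file to be present locally):
# reply = chat(
#     "Taiwan-inquiry_7B_v2.0-Q4_K_M.gguf",
#     "You are a 45-year-old patient with chest pain for two days.",
#     "Hello, what brings you in today?",
# )
```

Whether the model honors the system prompt in character depends on its chat template; if replies drift, moving the background into the first user turn is a common workaround.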

Model size: 7.49B params
Architecture: llama
