Commit 5fd2ec4 by khaimaitien (parent: da8d6d0)

Create README.md

# Model Card for qa-expert-7B-V1.0-GGUF

<!-- Provide a quick summary of what the model is/does. -->
This repo contains the GGUF format model files for [khaimaitien/qa-expert-7B-V1.0](https://huggingface.co/khaimaitien/qa-expert-7B-V1.0).

You can find more information about how to **use/train** the model in this repo: https://github.com/khaimt/qa_expert

### Model Sources

<!-- Provide the basic links for the model. -->

- **Repository:** https://github.com/khaimt/qa_expert

## How to Get Started with the Model
First, clone the repo: https://github.com/khaimt/qa_expert

Then install the requirements:

```shell
pip install -r requirements.txt
```

Then install [llama-cpp-python](https://github.com/abetlen/llama-cpp-python).

Here is the example code:

```python
from qa_expert import get_inference_model, InferenceType

def retrieve(query: str) -> str:
    # You need to implement this retrieval function: its input is a query and its
    # output is a context string. It plays the same role as a function that the
    # model can call, as in OpenAI function calling.
    return context

model_inference = get_inference_model(InferenceType.llama_cpp, "qa-expert-7B-V1.0.q4_0.gguf")
question = "your question here"  # the user question to answer
answer, messages = model_inference.generate_answer(question, retrieve)
```
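
As one illustration, the `retrieve` stub above could be backed by a simple keyword-overlap search over a small in-memory corpus. This is a hypothetical sketch (the `DOCUMENTS` list and the overlap scoring are placeholders, not part of the qa_expert project); a real deployment would more likely use an embedding model with a vector store:

```python
# Hypothetical minimal retriever: rank a tiny in-memory corpus by word overlap
# with the query and return the best-matching passage. Placeholder only; a real
# system would typically use embeddings and a vector index instead.
DOCUMENTS = [
    "Paris is the capital of France.",
    "The Mistral-7B model was released by Mistral AI.",
    "GGUF is a file format used by llama.cpp for quantized models.",
]

def retrieve(query: str) -> str:
    query_words = set(query.lower().split())

    def overlap(doc: str) -> int:
        # Count how many words the document shares with the query.
        return len(query_words & set(doc.lower().split()))

    # Return the document with the largest word overlap.
    return max(DOCUMENTS, key=overlap)
```

A function like this can then be passed as the retriever argument to `generate_answer`, which calls it once per sub-question to fetch context.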