ekshat committed
Commit
9fc1c62
1 Parent(s): 54cdabc

Create README.md

Files changed (1)
  1. README.md +42 -0
README.md ADDED
---
datasets:
- b-mc2/sql-create-context
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- text-2-sql
- text-generation
---
# Model Description

This model is a fine-tune of the Llama-2 7B model for text-to-SQL generation, trained on data in the Alpaca instruction format. The dataset is "b-mc2/sql-create-context", available on Hugging Face. We used QLoRA together with the bitsandbytes, Accelerate, and Transformers libraries to apply parameter-efficient fine-tuning (PEFT). The base checkpoint is the pre-trained 'NousResearch/Llama-2-7b-chat-hf' model.

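The training code itself is not included in this card. As a rough illustration only, a QLoRA/PEFT setup of the kind described above typically looks like the sketch below; the quantization settings and LoRA hyperparameters shown are assumptions, not the exact configuration used for this model.

```python
# Minimal QLoRA sketch (illustrative; hyperparameters are assumed, not the published training config).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

base_model = "NousResearch/Llama-2-7b-chat-hf"

# Load the frozen base model in 4-bit NF4 precision via bitsandbytes.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(
    base_model,
    quantization_config=bnb_config,
    device_map="auto",
)

# Attach small trainable LoRA adapters; only these are updated during fine-tuning (PEFT).
lora_config = LoraConfig(
    r=64,            # adapter rank (assumed value)
    lora_alpha=16,
    lora_dropout=0.1,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# Fine-tuning then proceeds with a standard Trainer/SFT loop over the
# Alpaca-style context/question/answer prompts built from b-mc2/sql-create-context.
```
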
# Inference

```python
# Install dependencies first: pip install transformers accelerate

from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# Option 1: load the model through a pipeline in one step
pipe = pipeline("text-generation", model="ekshat/Llama-2-7b-chat-finetune-for-text2sql")

# Option 2: load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("ekshat/Llama-2-7b-chat-finetune-for-text2sql")
model = AutoModelForCausalLM.from_pretrained("ekshat/Llama-2-7b-chat-finetune-for-text2sql")

context = "CREATE TABLE head (name VARCHAR, born_state VARCHAR, age VARCHAR)"
question = "List the name, born state and age of the heads of departments ordered by age."

# Prompt in the format the model was fine-tuned on
prompt = f"""Below is an context that describes a sql query, paired with an question that provides further information. Write an answer that appropriately completes the request.
### Context:
{context}
### Question:
{question}
### Answer:"""

pipe = pipeline(task="text-generation", model=model, tokenizer=tokenizer, max_length=200)
result = pipe(prompt)
print(result[0]['generated_text'])
```
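
With the default pipeline settings, `generated_text` contains the prompt followed by the completion. A small helper such as the one below (an assumption for illustration, not part of the model card) isolates just the SQL that follows the `### Answer:` marker:

```python
# Illustrative post-processing helper (assumption, not from the model card).
def extract_sql(generated_text: str) -> str:
    # The pipeline echoes the prompt, so keep only the text after the answer marker.
    answer = generated_text.split("### Answer:")[-1]
    # Trim whitespace and drop anything after the first statement terminator, if any.
    return answer.strip().split(";")[0].strip()

# Continuing from the snippet above:
print(extract_sql(result[0]['generated_text']))
```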