hassanaliemon asif00 committed on
Commit 5f5bd30
1 Parent(s): d9d9294

Added README.md (#1)


- Added README.md (b11d7e7d5e66104f6cd9c374714641b54f74a216)


Co-authored-by: Abdullah Al Asif <asif00@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +62 -0
README.md ADDED
@@ -0,0 +1,62 @@
---
license: apache-2.0
datasets:
- BanglaLLM/bangla-alpaca
language:
- bn
library_name: transformers
pipeline_tag: question-answering
---

# How to Use:

You can use the model with a pipeline as a high-level helper, or load the model directly. Here's how:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="hassanaliemon/bn_rag_llama3-8b")
```
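
A minimal sketch of calling the pipeline, assuming the model works with the standard `question-answering` call signature; the question/context pair here is the same Bengali example used in the Example Usage section below:

```python
# Illustrative only: a question-answering pipeline takes a question/context
# pair and returns a dict containing the extracted answer.
result = pipe(
    question="ভারতীয় বাঙালি কথাসাহিত্যিক মহাশ্বেতা দেবীর মৃত্যু কবে হয় ?",
    context="২০১৬ সালের ২৩ জুলাই হৃদরোগে আক্রান্ত হয়ে মহাশ্বেতা দেবী কলকাতার বেল ভিউ ক্লিনিকে ভর্তি হন। সেই বছরই ২৮ জুলাই একাধিক অঙ্গ বিকল হয়ে তাঁর মৃত্যু ঘটে। তিনি মধুমেহ, সেপ্টিসেমিয়া ও মূত্র সংক্রমণ রোগেও ভুগছিলেন।",
)
print(result)
```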

Alternatively, load the tokenizer and model directly:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("hassanaliemon/bn_rag_llama3-8b")
model = AutoModelForCausalLM.from_pretrained("hassanaliemon/bn_rag_llama3-8b")
model.to("cuda")  # the generate_response helper below assumes the model is on a CUDA device
```

# General Prompt Structure:

```python
prompt = """Below is an instruction in Bengali language that describes a task, paired with an input also in Bengali language that provides further context. Write a response in Bengali language that appropriately completes the request.

### Instruction:
{}

### Input:
{}

### Response:
{}
"""
```
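
The first slot takes the question, the second takes the supporting context, and the Response slot is left empty so the model completes it. A small illustrative fill, using hypothetical placeholder strings:

```python
# Hypothetical placeholder values, only to show how the template is filled
question = "আপনার প্রশ্ন এখানে"      # "your question here"
context = "প্রাসঙ্গিক তথ্য এখানে"     # "the relevant information here"
filled_prompt = prompt.format(question, context, "")  # Response slot left empty for the model to fill
```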

# Generating a Response:

To get a cleaned-up version of the response, you can use the `generate_response` function:

```python
def generate_response(question, context):
    # Fill the prompt template, leaving the Response slot empty for the model to complete
    inputs = tokenizer([prompt.format(question, context, "")], return_tensors="pt").to("cuda")
    outputs = model.generate(**inputs, max_new_tokens=1024, use_cache=True)
    # Decode the full generation and keep only the text after "### Response:"
    responses = tokenizer.batch_decode(outputs, skip_special_tokens=True)[0]
    response_start = responses.find("### Response:") + len("### Response:")
    response = responses[response_start:].strip()
    return response
```

# Example Usage:

```python
# Question (Bengali): "When did the Indian Bengali author Mahasweta Devi die?"
question = "ভারতীয় বাঙালি কথাসাহিত্যিক মহাশ্বেতা দেবীর মৃত্যু কবে হয় ?"
# Context (Bengali): a passage about her hospitalization on 23 July 2016 and death on 28 July 2016
context = "২০১৬ সালের ২৩ জুলাই হৃদরোগে আক্রান্ত হয়ে মহাশ্বেতা দেবী কলকাতার বেল ভিউ ক্লিনিকে ভর্তি হন। সেই বছরই ২৮ জুলাই একাধিক অঙ্গ বিকল হয়ে তাঁর মৃত্যু ঘটে। তিনি মধুমেহ, সেপ্টিসেমিয়া ও মূত্র সংক্রমণ রোগেও ভুগছিলেন।"
answer = generate_response(question, context)
print(answer)
```