shubhayansarkar committed
Commit 99bf20e · verified · 1 Parent(s): 516f4fc

Update README.md

Files changed (1)
  1. README.md +24 -12
README.md CHANGED
```diff
@@ -24,20 +24,32 @@
 from transformers import pipeline
 
 # Load the model from Hugging Face
-qa_pipeline = pipeline("question-answering", model="your_model_name")
+import torch
+from transformers import AutoModelForCausalLM, AutoTokenizer
+
+model = AutoModelForCausalLM.from_pretrained(
+    "pitangent-ds/academic_phy",
+    load_in_4bit=True,  # Quantized model
+    device_map="auto",
+    # llm_int8_enable_fp32_cpu_offload=True
+)
+tokenizer = AutoTokenizer.from_pretrained("pitangent-ds/academic_phy")
+
+# Perform inference
+text = "What are units ?"
+inputs = tokenizer(text, return_tensors="pt")
+outputs = model.generate(**inputs)
+decoded_output = tokenizer.decode(outputs[0], skip_special_tokens=True)
+
+print(decoded_output)
 
 # Ask a question
-data = {
-    "question": "State the law of reflection and explain its applications.",
-    "context": "ICSE Physics Class 9"
-}
+question = "what are units?"
 response = qa_pipeline(data)
 print(response["answer"])
-Training Details
-Dataset: Curated ICSE Physics content for Classes 9 and 10, including textbooks, sample papers, and online resources.
-Model Base: [Insert Base Model Name, e.g., BERT, GPT-3, Llama 2]
+
+# Training Details
+Dataset: Curated ICSE Physics content for Classes 9 and 10 textbooks
 Loss Function: Cross-entropy loss
-Final Training Loss: 0.21
-Evaluation Metric: Achieved a BLEU score of 88.3 on ICSE-specific Physics QA datasets.
-Training Framework: [Insert framework, e.g., PyTorch, Hugging Face Transformers]
-Limitations
+Final Training Loss: 0.88
+Training Framework: PyTorch, Hugging Face Transformers
```
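Note that the committed quickstart still ends with the stale lines `response = qa_pipeline(data)` and `print(response["answer"])`, which reference names the new example no longer defines, and that recent transformers releases deprecate passing `load_in_4bit` directly to `from_pretrained` in favor of a `BitsAndBytesConfig`. Below is a minimal end-to-end sketch of what the updated example presumably intends; the `ask()` helper, the `max_new_tokens` limit, and the compute-dtype choice are illustrative additions, not part of the repository.

```python
# A minimal sketch assuming "pitangent-ds/academic_phy" loads as a
# 4-bit quantized causal LM via bitsandbytes. The ask() helper and
# generation settings below are illustrative, not from the README.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "pitangent-ds/academic_phy"

# Newer transformers versions prefer a BitsAndBytesConfig passed as
# quantization_config over the bare load_in_4bit=True keyword.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

def ask(question: str, max_new_tokens: int = 128) -> str:
    """Tokenize a question, generate a continuation, and decode it."""
    inputs = tokenizer(question, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Ask a question
question = "What are units?"
print(ask(question))
```

This keeps the two halves of the committed example consistent: the quantized loading block and the question-asking tail operate on the same names, so the snippet runs as a single script.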