Commit 62ff58f by sreeramajay (parent: b773950): model card

Files changed: README.md (+74 −0)
---
license: apache-2.0
datasets:
- Intel/orca_dpo_pairs
language:
- en
metrics:
- accuracy
pipeline_tag: text-generation
---

Applied DPO to TinyLlama-1.1B-intermediate-step-1431k-3T using the Intel/orca_dpo_pairs dataset.

This is an experimental model, created by following the instructions in the blog post [Fine-tune a Mistral-7b model with Direct Preference Optimization](https://towardsdatascience.com/fine-tune-a-mistral-7b-model-with-direct-preference-optimization-708042745aac).
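
DPO trains the policy to prefer the "chosen" answer over the "rejected" one in each orca_dpo_pairs example, relative to a frozen reference model. The per-pair loss can be sketched in plain Python (this illustrates the DPO objective itself, not the exact training code used for this model; the log-probability values below are made-up numbers):

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for one preference pair.

    Each argument is the summed log-probability of the chosen/rejected
    completion under the trained policy or the frozen reference model.
    """
    chosen_ratio = policy_chosen_logp - ref_chosen_logp
    rejected_ratio = policy_rejected_logp - ref_rejected_logp
    logits = beta * (chosen_ratio - rejected_ratio)
    # -log(sigmoid(logits)): shrinks as the policy, relative to the
    # reference, assigns more probability to the chosen completion
    return -math.log(1.0 / (1.0 + math.exp(-logits)))

# When policy and reference agree exactly, the loss is -log(0.5)
print(round(dpo_loss(-10.0, -12.0, -10.0, -12.0), 4))  # 0.6931
```

In practice this objective is usually applied batch-wise via a trainer such as TRL's `DPOTrainer`, as in the blog post above.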

You can run this model with the following code:

```python
import transformers
from transformers import AutoTokenizer

new_model = "..."  # path or Hub ID of this model

# Format the prompt with the model's chat template
message = [
    {"role": "system", "content": "You are a helpful assistant chatbot."},
    {"role": "user", "content": "What is a Large Language Model?"},
]
tokenizer = AutoTokenizer.from_pretrained(new_model)
prompt = tokenizer.apply_chat_template(message, add_generation_prompt=True, tokenize=False)

# Create a text-generation pipeline
pipeline = transformers.pipeline(
    "text-generation",
    model=new_model,
    tokenizer=tokenizer,
)

# Generate text
sequences = pipeline(
    prompt,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    num_return_sequences=1,
    max_length=200,
)
print(sequences[0]['generated_text'])

# Example output:
# <s>[INST] <<SYS>>
# You are a helpful assistant chatbot.
# <</SYS>>
#
# What is a Large Language Model? [/INST]
# Largely, it is a machine learning model that is trained on a large dataset and is capable of generating large amounts of text with a certain degree of accuracy.
#
# A: If you are talking about a computer program that can generate texts, you can look at the topic of Natural Language Generation (NLG) for a more precise definition.
# The main difference between NLG and machine learning is that NLG is a subfield of AI and is used to generate text from an input, while machine learning is used to analyze data, make predictions and classify it.
```

Results on the GPT4ALL benchmark:

| Tasks         | Metric   |  Value |   | Stderr |
|---------------|----------|-------:|---|-------:|
| arc_challenge | acc      | 0.2807 | ± | 0.0131 |
|               | acc_norm | 0.3106 | ± | 0.0135 |
| arc_easy      | acc      | 0.6107 | ± | 0.0100 |
|               | acc_norm | 0.5547 | ± | 0.0102 |
| boolq         | acc      | 0.5865 | ± | 0.0086 |
| hellaswag     | acc      | 0.4478 | ± | 0.0050 |
|               | acc_norm | 0.5924 | ± | 0.0049 |
| openbookqa    | acc      | 0.2160 | ± | 0.0184 |
|               | acc_norm | 0.3600 | ± | 0.0215 |
| piqa          | acc      | 0.7280 | ± | 0.0104 |
|               | acc_norm | 0.7301 | ± | 0.0104 |
| winogrande    | acc      | 0.5856 | ± | 0.0138 |
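
The Stderr column is consistent with the usual binomial standard error, sqrt(p·(1−p)/n). As a sanity check (assuming the standard 1,172-question ARC-Challenge test split, which the table itself does not state):

```python
import math

def binomial_stderr(acc: float, n: int) -> float:
    """Standard error of an accuracy estimate over n examples."""
    return math.sqrt(acc * (1.0 - acc) / n)

# arc_challenge acc from the table; n assumed to be the 1,172-item test split
print(round(binomial_stderr(0.2807, 1172), 4))  # 0.0131
```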