1littlecoder committed
Commit: 6cb8bb3
Parent(s): 5c1d51a

code formatting

Files changed (1): README.md (+6, -6)
README.md CHANGED
@@ -68,25 +68,25 @@ BLING has not been designed for end consumer-oriented applications, and there ha
 ## How to Get Started with the Model
 
 The fastest way to get started with BLING is through direct import in transformers:
-
+```
 from transformers import AutoTokenizer, AutoModelForCausalLM
 tokenizer = AutoTokenizer.from_pretrained("llmware/bling-1.4b-0.1")
 model = AutoModelForCausalLM.from_pretrained("llmware/bling-1.4b-0.1")
-
+```
 
 The BLING model was fine-tuned with a simple "\<human> and \<bot> wrapper", so to get the best results, wrap inference entries as:
-
+```
 full_prompt = "\<human>\: " + my_prompt + "\n" + "\<bot>\: "
-
+```
 The BLING model was fine-tuned with closed-context samples, which assume generally that the prompt consists of two sub-parts:
 
 1. Text Passage Context, and
 2. Specific question or instruction based on the text passage
 
 To get the best results, package "my_prompt" as follows:
-
+```
 my_prompt = {{text_passage}} + "\n" + {{question/instruction}}
-
+```
 
 ## Citation [optional]
 
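
Taken together, the README snippets in this diff form a short inference script. The sketch below assembles them end to end for reference; it is illustrative only and not part of the commit. The model name and the \<human>/\<bot> wrapper come from the README (the backslashes there are markdown escapes and are dropped in runnable code), while the sample passage, question, and generation settings such as max_new_tokens are assumptions.

```python
# Minimal end-to-end sketch assembling the README snippets from this commit.
# The sample passage, question, and generation settings are illustrative
# assumptions, not part of the model card.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("llmware/bling-1.4b-0.1")
model = AutoModelForCausalLM.from_pretrained("llmware/bling-1.4b-0.1")

# Closed-context prompt: a text passage followed by a question about that passage.
text_passage = "The invoice total is $4,500 and payment is due on March 15."  # hypothetical example
question = "What is the invoice total?"                                       # hypothetical example
my_prompt = text_passage + "\n" + question

# Wrap with the <human>/<bot> markers the model was fine-tuned on.
full_prompt = "<human>: " + my_prompt + "\n" + "<bot>: "

inputs = tokenizer(full_prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)

# Decode only the tokens generated after the prompt.
answer = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(answer)
```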