Z3R6X committed on
Commit 39541ed
1 Parent(s): 22eb51a

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -13,7 +13,7 @@ The model was finetuned with the following prompt: \
  ``"Answer the following question in context:\n\nQuestion: " + samples["prompt"] + " Answer: "`` \
  It should be beneficial to use the same or a similar prompt for inference.
 
- An increase of performance compared to [GPT4All-J v1.3](https://huggingface.co/nomic-ai/gpt4all-j) was observed when using two-shot Chain-of-Thought prompting.
+ An increase in performance compared to [GPT4All-J v1.3](https://huggingface.co/nomic-ai/gpt4all-j) was observed when using two-shot Chain-of-Thought prompting.
 
  | HellaSwag | WinoGrande | BooLQ | ARC-c |
  |:------:|:------:|:------:|:------:|
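
For inference, the prompt template quoted in the diff above can be applied as in the following sketch. This is a minimal, hypothetical example that assumes the checkpoint loads as a standard `transformers` causal LM; the actual repo id is not shown in this commit, so the model id below is only a placeholder.

```python
# Hypothetical inference sketch: the real repo id is not shown in this diff,
# so "Z3R6X/<model-name>" is a placeholder and must be replaced.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Z3R6X/<model-name>"  # placeholder, not the actual repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

question = "What causes seasons on Earth?"

# Same prompt template the README says the model was finetuned with
prompt = "Answer the following question in context:\n\nQuestion: " + question + " Answer: "

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```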