bhenrym14 committed on
Commit 16f0ffb
1 Parent(s): b8db8fb

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -43,7 +43,7 @@ Here I explore whether training on long sequences that have clear conceptual dep
 
 ## Relative Performance (perplexity)
 
-| Context (tokens) | airophin-13b-pntk-16k-fp16| bhenrym14/airoboros-13b-gpt4-1.4.1-PI-8192-fp16 |bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16 | jondurbin/airoboros-l2-13b-gpt4-1.4.1 |
+| Context (tokens) | bhenrym14/airophin-13b-pntk-16k-fp16| bhenrym14/airoboros-13b-gpt4-1.4.1-PI-8192-fp16 |bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16 | jondurbin/airoboros-l2-13b-gpt4-1.4.1 |
 | ---| ----- | -----| ------| --- |
 | 512 | 7.62 | 8.24 | 7.90 | **7.23** |
 | 1024 | 6.20 | 6.71 | 6.17 | **5.85** |