doberst committed
Commit 7ba7908
1 Parent(s): 83afefb

Update README.md

Files changed (1): README.md (+18 -0)
README.md CHANGED
@@ -12,6 +12,24 @@ BLING models are fine-tuned with distilled high-quality custom instruct datasets
  the objective of providing a high-quality Instruct model that is 'inference-ready' on a CPU laptop even
  without using any advanced quantization optimizations.
 
+ ### **PERFORMANCE on BASIC RAG TEST DATASET**
+
+ | Model                   | Params (B) | Sourcing | GPU/CPU   | Output Tokens | Out as % of Input | Process Time (secs) | Score (0-100) |
+ | :---------------------- | :--------: | :------: | :-------: | :-----------: | :---------------: | :-----------------: | :-----------: |
+ | gpt-4                   | <=1000     | Closed   | Multi-GPU | 2665          | 10.53%            | 183.8               | 100           |
+ | gpt-3.5-turbo-instruct  | <=175      | Closed   | Multi-GPU | 2621          | 11.49%            | 62.7                | 100           |
+ | claude-instant-v1       | <=50       | Closed   | Multi-GPU | 6337          | 26.50%            | 154                 | 100           |
+ | aib-read-gpt            | 7          | Closed   | GPU       | 1964          | 9.30%             | 114                 | 96            |
+ | **bling_falcon-1b-0.1** | **1.3**    | **Open** | **CPU**   | **3204**      | **14.55%**        | **696**             | **77**        |
+ | bling_pythia-1.4b-0.1   | 1.4        | Open     | CPU       | 2589          | 11.75%            | 593.5               | 65            |
+ | bling_pythia-1b-0.1     | 1.0        | Open     | CPU       | 2753          | 12.49%            | 428                 | 59            |
+ | bling_cerebras-1.3b     | 1.3        | Open     | CPU       | 3202          | 20.01%            | 690.1               | 52            |
+ | bling_pythia_410m       | 0.41       | NA       | CPU       | 2349          | 10.66%            | 189                 | 36            |
+ | bling_cerebras_590m     | 0.59       | NA       | CPU       | 4407          | 20.01%            | 400.8               | 30            |
+
+ For more details on this evaluation, please see the dataset **llmware/rag_instruct_test_dataset_0.1** and the accompanying [BLOG](https://medium.com/@darrenoberst/evaluating-llm-performance-in-rag-instruct-use-cases-083dc272a31d).
+
+
  ### Model Description
 
  <!-- Provide a longer summary of what this model is. -->
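
The derived columns in the table above follow directly from the raw counts; a minimal sketch of the arithmetic (the function names are ours for illustration, not part of the benchmark):

```python
def out_as_pct_of_input(output_tokens: int, input_tokens: int) -> float:
    """'Out as % of Input' column: output tokens as a percentage of input tokens."""
    return 100.0 * output_tokens / input_tokens

def tokens_per_sec(output_tokens: int, process_time_secs: float) -> float:
    """Throughput implied by the 'Output Tokens' and 'Process Time (secs)' columns."""
    return output_tokens / process_time_secs

# bling_falcon-1b-0.1 row: 3204 output tokens generated in 696 secs on CPU
print(round(tokens_per_sec(3204, 696), 1))  # → 4.6 tokens/sec
```

Reading the rows this way makes the trade-off explicit: the sub-2B open models trail the closed multi-GPU models on both score and throughput, but still complete the full test set on a plain CPU.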