OferB committed
Commit b56a22e
1 Parent(s): 72c862f

Update hf_benchmark_example.py

Files changed (1)
  1. hf_benchmark_example.py +2 -2
hf_benchmark_example.py CHANGED
@@ -1,4 +1,4 @@
-''''
+"""
 cmd example
 You need a file called "sample.txt" (default path) with text to take tokens for prompts or supply --text_file "path/to/text.txt" as an argument to a text file.
 You can use our attached "sample.txt" file with one of Deci's blogs as a prompt.
@@ -8,7 +8,7 @@ python time_hf.py --model Deci/DeciLM-6b
 
 # Run this and record tokens per second (136 tokens per second on A10 for meta-llama/Llama-2-7b-hf), CUDA OOM above batch size 8
 python time_hf.py --model meta-llama/Llama-2-7b-hf --batch_size 8
-''''
+"""
 
 import json
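
For context on the two changed lines: four single quotes ('''') do not form a balanced triple-quoted block in Python, so the header of hf_benchmark_example.py could not be parsed as a module docstring, while the triple double quotes introduced here do. The snippet below is an illustrative sketch only, not part of the repository; it uses the standard-library ast module to show the difference.

# Illustrative sketch only, not part of the repository: demonstrates why the
# delimiter swap from '''' to """ matters, using the standard-library ast module.
import ast

broken_header = "''''\ncmd example\n''''\n"   # four single quotes, as before this commit
fixed_header = '"""\ncmd example\n"""\n'      # triple double quotes, as after this commit

try:
    ast.parse(broken_header)
except SyntaxError as err:
    # The stray fourth quote leaves an unterminated string literal behind.
    print("old delimiter fails to parse:", err.msg)

module = ast.parse(fixed_header)
print("new delimiter parses; module docstring:", ast.get_docstring(module))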