Tags: Text Generation · Transformers · PyTorch · llama · text-generation-inference · Inference Endpoints
soujanyaporia committed · Commit ca531d5 · 1 Parent(s): 26e5215

Update README.md

Files changed (1): README.md (+4 −1)
README.md CHANGED
@@ -1,6 +1,9 @@
 ---
 license: apache-2.0
 library_name: transformers
+metrics:
+- accuracy
+- code_eval
 ---
 
 # Flacuna: A Vicuna made of Flan
@@ -39,4 +42,4 @@ As a result of this fine-tuning process, Flacuna exhibited notable performance i
 | Flacuna | 13B | 49.4 | 32.5 | 67.9 |
 
 
-During training, Flacuna employed a maximum input sequence length of 1280. We utilized LoRA for parameter-efficient fine-tuning.
+During training, Flacuna employed a maximum input sequence length of 1280. We utilized LoRA for parameter-efficient fine-tuning.
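The last line of the diff notes that Flacuna was fine-tuned with LoRA at a maximum input sequence length of 1280. A minimal sketch of why LoRA is parameter-efficient, assuming an illustrative adapter rank of 16 and a 5120×5120 attention projection (the hidden size of a 13B LLaMA); the rank and layer shape are assumptions for illustration, only the 1280 sequence length comes from the commit:

```python
# LoRA replaces a full weight update dW (d_in x d_out) with two
# low-rank factors A (d_in x r) and B (r x d_out), so only the
# factors are trained. Rank r=16 and the 5120x5120 shape below
# are illustrative assumptions, not values from this commit.

MAX_SEQ_LEN = 1280  # maximum input sequence length, per the model card

def lora_param_count(d_in: int, d_out: int, rank: int) -> int:
    """Trainable parameters for a LoRA adapter on a d_in x d_out weight."""
    return d_in * rank + rank * d_out

full = 5120 * 5120                            # full-rank update
lora = lora_param_count(5120, 5120, rank=16)  # low-rank adapter
print(f"full: {full:,}  lora: {lora:,}  ratio: {full // lora}x")
# For this layer, the adapter trains roughly 1/160th of the parameters.
```

The same ratio applies per adapted projection, which is what makes fine-tuning a 13B model feasible on modest hardware.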