Text Generation
Transformers
English
codegen
Inference Endpoints
mhhmm committed on
Commit
a9612a5
1 Parent(s): 1f410d2

Update README.md

Files changed (1)
  1. README.md +5 -10
README.md CHANGED
@@ -11,15 +11,10 @@ pipeline_tag: text-generation
 
 LLM: [Salesforce/CodeGen-6B-Mono](https://huggingface.co/Salesforce/codegen-6B-mono)
 
-Tuning:
-
 I'm using [Peft](https://github.com/huggingface/peft) for tuning
-- [x] LoRA
-- [ ] Prefix-tuning (P-tuning v2)
 
-Dataset:
-- [x] [Leetcode](https://huggingface.co/datasets/mhhmm/leetcode-solutions-python)
-- [x] [Google Deepind Code contests](https://huggingface.co/datasets/deepmind/code_contests)
-
-Training:
-- [x] Google Colab Pro+ in 2 hours
+Tuning:
+- [LoRA](https://github.com/microsoft/LoRA)
+- [Leetcode](https://huggingface.co/datasets/mhhmm/leetcode-solutions-python)
+- [Google Deepind Code contests](https://huggingface.co/datasets/deepmind/code_contests)
+- Google Colab Pro+ in 2 hours, shoutout to my friend TieuPhuong
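The README credits LoRA (via Peft) for the tuning. As a rough sketch of what LoRA itself does — train a low-rank update on top of a frozen weight matrix — here is a minimal NumPy illustration; the dimensions, initialisation, and function names are illustrative only and are not taken from the actual training code:

```python
import numpy as np

# LoRA sketch: instead of updating the full weight W (d_out x d_in),
# train a low-rank pair B (d_out x r) and A (r x d_in); the effective
# weight is W + (alpha / r) * B @ A. Only A and B receive gradients,
# so the number of trainable parameters drops from d_out*d_in to
# r * (d_out + d_in).
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 8, 2, 16   # illustrative sizes

W = rng.normal(size=(d_out, d_in))     # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable, small random init
B = np.zeros((d_out, r))               # trainable, zero init

def lora_forward(x):
    # x: (batch, d_in) -> (batch, d_out), using the adapted weight
    return x @ (W + (alpha / r) * (B @ A)).T

x = rng.normal(size=(4, d_in))
# Because B starts at zero, the adapter initially changes nothing:
assert np.allclose(lora_forward(x), x @ W.T)
```

Zero-initialising B is the standard LoRA trick: training starts exactly at the pretrained model, and the adapter only diverges from it as B picks up gradient updates.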