---
license: mit
datasets:
- mhhmm/leetcode-solutions-python
- deepmind/code_contests
language:
- en
library_name: transformers
pipeline_tag: text-generation
widget:
- text: |-
    # Given an array of integers, return indices of the two numbers such that they add up to a specific target

    def twoSum(array, target) -> List[int]:
  example_title: "Two-sum problem"
---

Base LLM: [Salesforce/CodeGen-6B-Mono](https://huggingface.co/Salesforce/codegen-6B-mono)
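
As a usage sketch, the adapter can presumably be loaded on top of the base model with `peft` and queried through `transformers`. The repo id below is a placeholder, not this model's confirmed identifier:

```python
# Minimal inference sketch, assuming this repo hosts PEFT adapter weights
# for the CodeGen-6B-mono base model.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("Salesforce/codegen-6B-mono")
model = PeftModel.from_pretrained(base, "<this-repo-id>")  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained("Salesforce/codegen-6B-mono")

# Same prompt as the widget example above
prompt = (
    "# Given an array of integers, return indices of the two numbers "
    "such that they add up to a specific target\n"
    "def twoSum(array, target) -> List[int]:\n"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```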

I'm using [PEFT](https://github.com/huggingface/peft) for parameter-efficient fine-tuning.

Tuning:
- [LoRA](https://github.com/microsoft/LoRA) (a minimal setup sketch follows this list)
- [LeetCode solutions in Python](https://huggingface.co/datasets/mhhmm/leetcode-solutions-python)
- [Google DeepMind Code Contests](https://huggingface.co/datasets/deepmind/code_contests)
- Trained on Google Colab Pro+ in ~2 hours; shoutout to my friend TieuPhuong
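
A minimal sketch of the LoRA setup with `peft`'s `LoraConfig`/`get_peft_model` API. The hyperparameters (`r`, `lora_alpha`, `lora_dropout`, `target_modules`) are illustrative assumptions, not the values actually used for this model:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

base = AutoModelForCausalLM.from_pretrained("Salesforce/codegen-6B-mono")

config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,  # causal language modeling
    r=16,                          # rank of the LoRA update matrices (assumed)
    lora_alpha=32,                 # scaling factor (assumed)
    lora_dropout=0.05,             # dropout on the LoRA layers (assumed)
    target_modules=["qkv_proj"],   # CodeGen's fused attention projection (assumed)
)

model = get_peft_model(base, config)
model.print_trainable_parameters()  # only the LoRA adapters are trainable
```

Wrapping the base model this way freezes the original 6B parameters and trains only the low-rank adapters, which is what makes fine-tuning on a single Colab GPU feasible.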