---
datasets:
  - mvasiliniuc/iva-kotlin-codeint-clean-train
  - mvasiliniuc/iva-kotlin-codeint-clean-valid
language:
  - code
tags:
  - gpt2
  - code
  - kotlin
  - mobile
  - generation
widget:
  - text: "/**\n\t* A function that returns the version of the current operating system.\n*/\n"
    example_title: Get current device operating system
  - text: "/**\n\t* A function that returns the current TimeZone.\n*/\n"
    example_title: Get current timezone
  - text: "/**\n\t* A data class representing a Bank Account.\n*/\n"
    example_title: Data Class - BankAccount
---

iva-codeint-kotlin-small is a GPT-2 model (small version, 239.4M parameters) trained from scratch for the text-to-code task, tailored to the Kotlin language as used in native mobile development (Android).

## Usage

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="mvasiliniuc/iva-codeint-kotlin-small")
outputs = pipe("fun printToConsole()")
```
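The pipeline call also accepts standard generation parameters. The sketch below is illustrative only: it prompts the model with a KDoc-style comment, as in the widget examples above, and the sampling settings are assumptions rather than tuned recommendations.

```python
# Illustrative only: KDoc-style prompt plus assumed generation settings.
prompt = "/**\n * A function that returns the current TimeZone.\n */\n"
outputs = pipe(
    prompt,
    max_length=128,          # prompt + generated tokens
    do_sample=True,          # sample instead of greedy decoding
    temperature=0.4,         # assumed value, not a tuned recommendation
    num_return_sequences=1,
)
print(outputs[0]["generated_text"])
```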

## Inference

```python
import pprint

import requests

API_URL = "https://api-inference.huggingface.co/models/mvasiliniuc/iva-codeint-kotlin-small"
headers = {"Authorization": "Bearer <key>"}

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

output = query({
    "inputs": """
/**
 * A public function that returns the current version of the operating system.
 */
"""
})
pprint.pprint(output, compact=True)
```
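On success, the text-generation Inference API typically returns a list of objects with a `generated_text` field; the snippet below assumes that shape and falls back to printing the raw payload (for example, an error message while the model is still loading).

```python
# Assumes the usual text-generation response shape: [{"generated_text": ...}].
if isinstance(output, list) and output and "generated_text" in output[0]:
    print(output[0]["generated_text"])
else:
    print(output)  # e.g. {"error": ...} while the model is loading or the token is invalid
```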

## Training

| Config | Value |
|---|---|
| seq length | 1024 |
| weight decay | 0.1 |
| learning rate | 0.0005 |
| max eval steps | -1 |
| shuffle buffer | 10000 |
| max train steps | 150000 |
| mixed precision | fp16 |
| num warmup steps | 2000 |
| train batch size | 5 |
| valid batch size | 5 |
| lr scheduler type | cosine |
| save checkpoint steps | 15000 |
| gradient checkpointing | false |
| gradient accumulation steps | 1 |
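As a rough guide to how these values fit together, the sketch below wires up the optimizer and learning-rate schedule from the table using PyTorch and transformers. It is an assumed reconstruction, not the actual training script, and omits data loading, fp16 handling, and checkpointing.

```python
import torch
from transformers import AutoConfig, GPT2LMHeadModel, get_cosine_schedule_with_warmup

# Assumed reconstruction of the optimizer/scheduler setup implied by the table above.
config = AutoConfig.from_pretrained("mvasiliniuc/iva-codeint-kotlin-small")
model = GPT2LMHeadModel(config)  # trained from scratch, so weights start from the config, not from gpt2

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-4, weight_decay=0.1)
scheduler = get_cosine_schedule_with_warmup(
    optimizer,
    num_warmup_steps=2_000,      # "num warmup steps"
    num_training_steps=150_000,  # "max train steps"
)
```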

## Resources

Resources used for research: