GIRT-Model
paper: https://arxiv.org/abs/2402.02632
demo: https://huggingface.co/spaces/nafisehNik/girt-space
GIRT-Model is fine-tuned on GIRT-Instruct data to generate issue report templates from a given input instruction.
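For a quick check, the model can also be run through the Transformers text2text-generation pipeline. The snippet below is a minimal sketch (the generation parameters are illustrative, not prescribed by this card); the Usage section below shows the full loading and postprocessing code.

from transformers import pipeline

# minimal sketch: quick generation via the text2text-generation pipeline
# (generation parameters here are illustrative, not prescribed by the model card)
generator = pipeline('text2text-generation', model='nafisehNik/girt-t5-base')

output = generator("YOUR INPUT INSTRUCTION", max_length=300, do_sample=True, top_p=0.92)
print(output[0]['generated_text'])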
Usage
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# load model and tokenizer
model = AutoModelForSeq2SeqLM.from_pretrained('nafisehNik/girt-t5-base')
tokenizer = AutoTokenizer.from_pretrained('nafisehNik/girt-t5-base')

# generate and postprocess an issue report template for a given input instruction
def compute(sample, top_p, top_k, do_sample, max_length, min_length):
    inputs = tokenizer(sample, return_tensors="pt").to('cpu')
    outputs = model.generate(
        **inputs,
        min_length=min_length,
        max_length=max_length,
        do_sample=do_sample,
        top_p=top_p,
        top_k=top_k).to('cpu')
    generated_texts = tokenizer.batch_decode(outputs, skip_special_tokens=False)
    generated_text = generated_texts[0]

    # strip special tokens and decoding artifacts from the generated template
    replace_dict = {
        '\n ': '\n',
        '</s>': '',
        '<pad> ': '',
        '<pad>': '',
        '<unk>!--': '<!--',
        '<unk>': '',
    }

    postprocess_text = generated_text
    for key, value in replace_dict.items():
        postprocess_text = postprocess_text.replace(key, value)

    return postprocess_text

prompt = "YOUR INPUT INSTRUCTION"
result = compute(prompt, top_p=0.92, top_k=0, do_sample=True, max_length=300, min_length=30)
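The snippet above pins everything to the CPU. On a machine with a CUDA GPU, the model and inputs can be moved to the GPU before generation; the sketch below shows that device-aware variant under the assumption that the same decoding and replace_dict postprocessing as in compute() are applied afterwards.

import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# minimal sketch: device-aware variant of the snippet above;
# the replace_dict postprocessing from compute() still applies to the decoded text
device = 'cuda' if torch.cuda.is_available() else 'cpu'

model = AutoModelForSeq2SeqLM.from_pretrained('nafisehNik/girt-t5-base').to(device)
tokenizer = AutoTokenizer.from_pretrained('nafisehNik/girt-t5-base')

inputs = tokenizer("YOUR INPUT INSTRUCTION", return_tensors="pt").to(device)
outputs = model.generate(**inputs, min_length=30, max_length=300,
                         do_sample=True, top_p=0.92, top_k=0)
generated_text = tokenizer.batch_decode(outputs, skip_special_tokens=False)[0]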
Citation
@article{nikeghbal2024girt,
  title={GIRT-Model: Automated Generation of Issue Report Templates},
  author={Nikeghbal, Nafiseh and Kargaran, Amir Hossein and Heydarnoori, Abbas},
  journal={arXiv preprint arXiv:2402.02632},
  year={2024}
}