# Model card for Llama2chat_telugu (romanized)
This model is a fine-tuned version of meta-llama/Llama-2-7b-chat-hf, trained on transliterated (romanized) Telugu sentences.
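If you want to work with the checkpoint outside of the `pipeline` helper (for example, for a custom generation loop), it can be loaded with the standard `transformers` classes. This is a minimal sketch, assuming the repository hosts a full merged causal-LM checkpoint together with its tokenizer files rather than a PEFT adapter:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the repo contains a merged checkpoint plus tokenizer files, not an adapter.
model_id = "eswardivi/llama2chat_telugu"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # requires the `accelerate` package
)
```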
## Usage
```python
from transformers import pipeline

# Load the fine-tuned model; device_map="auto" requires the `accelerate` package.
pipe = pipeline(
    "text-generation",
    model="eswardivi/llama2chat_telugu",
    device_map="auto",
)

def create_prompt(instruction: str) -> str:
    # Wrap a romanized Telugu instruction in the Instruction/Response prompt template.
    prompt = f"""
### Instruction:
{instruction}
### Response:
"""
    return prompt

prompt = create_prompt("Naku python Program 1 to 10 count cheyadaniki ivvu")

out = pipe(
    prompt,
    num_return_sequences=1,
    max_new_tokens=1024,
    temperature=0.7,
    top_p=0.9,
    do_sample=True,
)
print(out[0]["generated_text"])
```
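Note that the text-generation pipeline returns the prompt together with the completion in `generated_text` (its `return_full_text` option defaults to `True`). Below is a minimal sketch for keeping only the model's answer by splitting on the template's `### Response:` marker; `extract_response` is just an illustrative helper, not part of this repository:

```python
def extract_response(generated_text: str) -> str:
    # The pipeline echoes the prompt, so keep only the text after the response marker.
    marker = "### Response:"
    if marker in generated_text:
        return generated_text.split(marker, 1)[1].strip()
    return generated_text.strip()

print(extract_response(out[0]["generated_text"]))
```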