## Usage
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Tokenizer comes from the base OPT-350m model; weights from the instruct fine-tune.
tok = AutoTokenizer.from_pretrained('facebook/opt-350m')
model = AutoModelForCausalLM.from_pretrained('prasanna2003/opt-350m-instruct')

system_message = "You are an AI language model that helps the human."
input_prompt = "Define data science."

# Build the prompt in the model's instruction format.
prompt = '<system>' + system_message + '<human>' + input_prompt + '<assistant>'

inputs = tok(prompt, return_tensors='pt')
out = model.generate(**inputs, max_length=120)
print(tok.decode(out[0]))
```
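
The decoded sequence contains the full prompt as well as the generated text. A minimal sketch of isolating only the assistant's reply, assuming the `<assistant>` marker survives decoding as plain text (the `decoded` and `reply` variable names are illustrative, not part of the model card):

```python
# Split on the '<assistant>' marker to keep only the model's reply,
# then drop a trailing end-of-sequence token if present.
decoded = tok.decode(out[0])
reply = decoded.split('<assistant>', 1)[-1].replace('</s>', '').strip()
print(reply)
```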