Please pass the argument `trust_remote_code=True` to allow custom code to be run.

#54
by mherfarhan - opened

Hi @mherfarhan, I think it should be -

from transformers import AutoModelForCausalLM, AutoTokenizer
llm = AutoModelForCausalLM.from_pretrained("microsoft/phi-2", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2")

Then pass the temperature and max_length during generation -

tokens = tokenizer("Hi this is Frodo", return_tensors="pt")  # return PyTorch tensors so generate() accepts them
llm.generate(**tokens, temperature=0, max_length=500)
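For intuition on what the `temperature` argument does: it rescales the model's logits before the softmax, so low temperatures concentrate probability on the top token (temperature near 0 behaves like greedy decoding). A minimal pure-Python sketch of that rescaling, purely illustrative and not the actual transformers implementation:

```python
import math

def softmax_with_temperature(logits, temperature):
    # Divide logits by temperature before softmax; lower temperature
    # sharpens the distribution toward the highest-scoring token.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
print(softmax_with_temperature(logits, 1.0))  # fairly spread out
print(softmax_with_temperature(logits, 0.1))  # nearly all mass on the top token
```

Note that in transformers, `temperature` only takes effect when sampling is enabled (`do_sample=True`); with the default greedy decoding it is ignored.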
mherfarhan changed discussion status to closed
