Produced Output Stability

#13
by luissimoes - opened

Hi,

I have been trying out the model, and I wanted to clarify one thing...
Is there a way to make the outputs more predictable? I am sending the same prompt and getting a different answer every time.
Is there a way to keep the answers stable, so that the same prompt consistently produces the same output?

Thanks

Which software are you using?
In llama.cpp main or server, you need to set --seed to a number. You should then get a reproducible, deterministic result for each seed.
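
For illustration, here is a minimal sketch of the same idea through the llama-cpp-python bindings (an assumption on my part: that the CLI's --seed and --temp flags map onto the corresponding Python parameters; the model path and prompt are placeholders):

```python
# Sketch: deterministic generation via llama-cpp-python (assumed backend).
from llama_cpp import Llama

llm = Llama(model_path="./model.gguf", seed=42)  # fixed RNG seed; placeholder path

# temperature=0.0 picks the most likely token at every step (greedy decoding),
# removing sampling randomness on top of the fixed seed.
out = llm("Why is the sky blue?", max_tokens=64, temperature=0.0)
print(out["choices"][0]["text"])
```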

I am using LangChain for now.

Is the seed something that is configured in the config section?
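
For reference, with LangChain the seed is usually set when the model object is constructed rather than per prompt. A minimal sketch, assuming the GGUF model is loaded through LangChain's LlamaCpp wrapper from langchain_community (the model path is a placeholder):

```python
# Sketch: fixing the seed in LangChain's llama.cpp wrapper (assumed setup).
from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="./model.gguf",  # placeholder path to the GGUF file
    seed=42,                    # fixed RNG seed for reproducible sampling
    temperature=0.0,            # greedy decoding: no sampling randomness at all
)

print(llm.invoke("Why is the sky blue?"))
```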
