How about original LLaMA Chat prompt format?

#3
by gotzmann - opened

What are the benefits of using the ChatML (OpenAI) prompt format with models that are meant to be used with the native LLaMA v2 Chat format? Maybe it's better to keep the default format for any tunes of models based on LLaMA Chat (like StellarBright)?
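For readers unfamiliar with the two conventions being compared, here is a minimal sketch of both prompt templates. The special tokens follow the published ChatML and LLaMA-2 Chat conventions; exact tokenization (e.g. whether `<s>` is added by the tokenizer) varies by model, so treat this as illustrative rather than a drop-in implementation.

```python
def chatml_prompt(system: str, user: str) -> str:
    """ChatML-style prompt, as used by OpenAI models (and Dolphin)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


def llama2_chat_prompt(system: str, user: str) -> str:
    """Native LLaMA-2 Chat prompt format ([INST]/<<SYS>> convention)."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"


if __name__ == "__main__":
    print(chatml_prompt("You are a helpful assistant.", "Hello!"))
    print(llama2_chat_prompt("You are a helpful assistant.", "Hello!"))
```

The point of contention in this thread is that a LLaMA-2 Chat fine-tune has only ever seen the second template, so retraining it on the first discards some of that format-specific conditioning.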

Cognitive Computations org

It is good enough for ChatGPT, it is good enough for me.

Yeah, but LLaMA Chat has already seen 400G tokens with concrete prompting (on top of 2T tokens with no prompting), so you're basically throwing away that gigantic pile of additional fine-tune learning by using different prompting, IMHO. And maybe that's part of the problems with Dolphin.

Cognitive Computations org

It is not.

I really wish more models were ChatML. Thanks for adopting it and making my life easier. 👍

Cognitive Computations org

Thank you for the kind words :-)

ehartford changed discussion status to closed
