OGNO-7b-dpo-truthful / added_tokens.json
Commit d177c9c (verified): Upload folder using huggingface_hub
{
"</s>": 2,
"<s>": 1,
"<unk>": 0
}
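These mappings assign IDs to the tokenizer's special tokens: <unk> is 0, <s> (beginning of sequence) is 1, and </s> (end of sequence) is 2. Below is a minimal sketch of checking them through the Hugging Face transformers API; the repo id eren23/OGNO-7b-dpo-truthful is assumed from the page context, not stated in the file itself.

from transformers import AutoTokenizer

# Repo id assumed from the page context; adjust if the file lives elsewhere.
tokenizer = AutoTokenizer.from_pretrained("eren23/OGNO-7b-dpo-truthful")

# Expected to match added_tokens.json: <unk> -> 0, <s> -> 1, </s> -> 2
for token in ["<unk>", "<s>", "</s>"]:
    print(token, tokenizer.convert_tokens_to_ids(token))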