trained_lukeB_model / added_tokens.json
Uploaded by hama3 — commit a16db97 ("Upload tokenizer")
{
"<ent2>": 32771,
"<ent>": 32770
}
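For reference, this file maps the special entity markers `<ent>` and `<ent2>` (used by LUKE-style tokenizers to delimit entity spans) to their vocabulary IDs. A minimal sketch of reading the mapping with the standard library (the inline JSON string stands in for the file contents; loading from disk would work the same way):

```python
import json

# The contents of added_tokens.json: special entity markers
# appended to the base vocabulary, keyed to their token IDs.
raw = '{"<ent2>": 32771, "<ent>": 32770}'
added_tokens = json.loads(raw)

# Sort by ID to see the order in which the tokens extend the vocabulary.
for token, token_id in sorted(added_tokens.items(), key=lambda kv: kv[1]):
    print(token, token_id)
# <ent> 32770
# <ent2> 32771
```

When the tokenizer is loaded with `transformers.AutoTokenizer.from_pretrained`, these entries are merged into the vocabulary automatically, so the IDs above are what `tokenizer.convert_tokens_to_ids("<ent>")` would return.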