textual_inversion_dog_2/tokenizer/added_tokens.json
hangeol: End of training (commit b135397, verified)
{
  "<dog1>": 49408,
  "<dog1>_1": 49409,
  "<dog1>_2": 49410
}
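As a minimal sketch of where these IDs come from: the stock CLIP BPE tokenizer used by Stable Diffusion has a base vocabulary of 49408 entries (IDs 0 through 49407), so placeholder tokens registered during textual-inversion training receive consecutive IDs starting at 49408. The suffixed copies (`<dog1>_1`, `<dog1>_2`) are the extra embedding vectors of a multi-vector placeholder. The helper below is illustrative, not part of any library API:

```python
# Sketch: how added placeholder tokens get their IDs.
# Assumption: the base vocabulary is the stock CLIP BPE vocabulary,
# whose 49408 entries occupy IDs 0..49407; every token added afterwards
# is assigned the next free ID in order of registration.

BASE_VOCAB_SIZE = 49408  # size of the stock CLIP BPE vocabulary


def assign_added_token_ids(placeholders, base_size=BASE_VOCAB_SIZE):
    """Map each new placeholder string to the next free token ID."""
    return {tok: base_size + i for i, tok in enumerate(placeholders)}


# One concept trained with three embedding vectors: the base token
# "<dog1>" plus the suffixed copies "_1" and "_2".
tokens = ["<dog1>", "<dog1>_1", "<dog1>_2"]
added = assign_added_token_ids(tokens)
# added == {"<dog1>": 49408, "<dog1>_1": 49409, "<dog1>_2": 49410}
```

This reproduces the mapping stored in `added_tokens.json` above; at inference time the tokenizer consults this file so the placeholder strings resolve to the same IDs the learned embeddings were trained under.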