Update the tokenizer by inserting added_tokens into the vocab

#75

Inserting the added_tokens into the vocab fixes warnings like this one:

```
2024-06-05T05:20:37.878921Z  WARN tokenizers::tokenizer::serialization: /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/tokenizers-0.19.1/src/tokenizer/serialization.rs:159: Warning: Token '<|assistant|>' was expected to have ID '32001' but was given ID 'None'
...
```
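The warning fires because tokens listed under `added_tokens` in `tokenizer.json` are missing from the model's `vocab` map, so deserialization cannot resolve their IDs. Below is a minimal sketch of the kind of patch this applies; the file path is illustrative, and it assumes a standard `tokenizer.json` layout with a `model.vocab` dict and an `added_tokens` list:

```python
import json

# Hypothetical path; point this at the repo's actual tokenizer.json.
path = "tokenizer.json"

with open(path, "r", encoding="utf-8") as f:
    tokenizer = json.load(f)

vocab = tokenizer["model"]["vocab"]

# Register each added token in the vocab under its declared ID, so the
# serializer no longer reports "expected to have ID ... but was given
# ID 'None'". setdefault leaves tokens that are already present alone.
for added in tokenizer["added_tokens"]:
    vocab.setdefault(added["content"], added["id"])

with open(path, "w", encoding="utf-8") as f:
    json.dump(tokenizer, f, ensure_ascii=False, indent=2)
```

After this change, loading the tokenizer should no longer emit the warning, since every added token (e.g. `<|assistant|>` at ID 32001) resolves to a concrete ID in the vocab.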