Experimental Tokenizer Fix
`GPTNeoXTokenizer` does not exist as a class, so a GPT2 fallback is currently used. With this commit I hope that HF begins to use `GPTNeoXTokenizerFast` instead, which does exist.
- tokenizer_config.json +1 -1
@@ -1 +1 @@
-{"unk_token": "<|endoftext|>", "bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "add_prefix_space": false, "tokenizer_class": "
+{"unk_token": "<|endoftext|>", "bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "add_prefix_space": false, "tokenizer_class": "GPTNeoXTokenizerFast"}
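As a quick sanity check (a minimal sketch using only the standard library, not the transformers loader itself), the updated `tokenizer_config.json` contents parse as valid JSON and expose the `tokenizer_class` key that `AutoTokenizer` dispatches on:

```python
import json

# Updated tokenizer_config.json contents from this commit
config_text = (
    '{"unk_token": "<|endoftext|>", "bos_token": "<|endoftext|>", '
    '"eos_token": "<|endoftext|>", "add_prefix_space": false, '
    '"tokenizer_class": "GPTNeoXTokenizerFast"}'
)

# Validate the JSON and inspect the class the loader will look up
parsed = json.loads(config_text)
print(parsed["tokenizer_class"])  # GPTNeoXTokenizerFast
```

With this value in place, loading via `AutoTokenizer.from_pretrained(...)` should resolve to the fast tokenizer class rather than failing on the nonexistent slow one.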