---
library_name: transformers
license: mit
language:
- en
---
# 48k vocab LlamaTokenizer for T5
A custom tokenizer from the [scaling study](https://huggingface.co/sail/scaling-with-vocab-trained-tokenizers), adapted for T5 training.
- Compression ratio: 3.54
- Vocabulary size: 48228
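
A minimal loading sketch with `transformers`; the repository id below is a placeholder, not the actual repo name:

```python
from transformers import AutoTokenizer

# Placeholder repo id -- substitute the repository this card belongs to.
tokenizer = AutoTokenizer.from_pretrained("path/to/this-tokenizer")

# Base vocabulary size; should report 48228 per the stats above.
print(tokenizer.vocab_size)
```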
Example tokenization:
`['▁In', '▁', '2', '0', '2', '3', ',', '▁Dr', '.', '▁Jane', '▁Smith', '-', 'John', 'son', '▁published', '▁groundbreaking', '▁research', '▁on', '▁quantum', '▁ent', 'ang', 'lement', ',', '▁demonstrating', '▁a', '▁', '9', '9', '.', '9', '%', '▁success', '▁rate', '▁in', '▁tele', 'port', 'ing', '▁qu', 'bits', '▁over', '▁', '1', '0', '0', 'km', '▁using', '▁her', '▁patented', "▁'", 'Q', '-', 'Link', "'", '▁technology', '.', '</s>']`
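
A sketch of how the token list above can be reproduced, continuing from the loading snippet; it assumes the tokenizer is configured to append `</s>` and that the reported compression ratio means characters per token:

```python
text = (
    "In 2023, Dr. Jane Smith-Johnson published groundbreaking research on "
    "quantum entanglement, demonstrating a 99.9% success rate in teleporting "
    "qubits over 100km using her patented 'Q-Link' technology."
)

# Encode, then map ids back to token strings; the trailing </s> comes from the
# tokenizer's post-processing (an assumption about this tokenizer's config).
ids = tokenizer(text)["input_ids"]
print(tokenizer.convert_ids_to_tokens(ids))

# Rough characters-per-token estimate for this single sentence; the reported
# 3.54 was presumably measured over a larger corpus.
print(len(text) / len(ids))
```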
![image/png](https://cdn-uploads.huggingface.co/production/uploads/60bccec062080d33f875cd0c/KL4UbQpJESQgnAf3FTtiS.png)