{
  "model_max_length": 65536,
  "tokenizer_class": "GPT4Tokenizer",
  "auto_map": {
    "AutoTokenizer": [
      "tokenization_transnormerllm.GPT4Tokenizer",
      null
    ]
  }
}