ancatmara committed
Commit
177e208
1 Parent(s): 2eabec3

Upload 3 files

special_tokens_map.json ADDED
@@ -0,0 +1,9 @@
+ {
+   "bos_token": "<s>",
+   "cls_token": "<s>",
+   "eos_token": "</s>",
+   "mask_token": "<mask>",
+   "pad_token": "<pad>",
+   "sep_token": "</s>",
+   "unk_token": "<unk>"
+ }
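
special_tokens_map.json assigns a literal string to each special-token role the tokenizer exposes. A minimal sketch of checking those roles after loading, assuming the three uploaded files sit in a hypothetical local directory ./tokenizer:

# Minimal sketch: load the uploaded files and inspect the roles
# declared in special_tokens_map.json. "./tokenizer" is a
# hypothetical local path standing in for this repository's files.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("./tokenizer")
print(tok.bos_token, tok.cls_token)                  # <s> <s>
print(tok.eos_token, tok.sep_token)                  # </s> </s>
print(tok.pad_token, tok.unk_token, tok.mask_token)  # <pad> <unk> <mask>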
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+   "clean_up_tokenization_spaces": true,
+   "model_max_length": 64,
+   "special_tokens": [
+     "<s>",
+     "<pad>",
+     "</s>",
+     "<unk>",
+     "<mask>",
+     "<true>",
+     "<false>"
+   ],
+   "tokenizer_class": "PreTrainedTokenizerFast"
+ }
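
tokenizer_config.json names PreTrainedTokenizerFast as the class backing tokenizer.json, caps inputs at 64 tokens, and lists two non-standard markers, <true> and <false>, alongside the usual specials. A sketch of loading it directly, again under the hypothetical ./tokenizer path assumption:

# Minimal sketch: instantiate the fast tokenizer named by
# tokenizer_config.json. "./tokenizer" is a hypothetical local path.
from transformers import PreTrainedTokenizerFast

tok = PreTrainedTokenizerFast.from_pretrained("./tokenizer")

# truncation=True falls back to model_max_length, which this
# config sets to 64 tokens.
enc = tok("an example sentence", truncation=True)
assert len(enc["input_ids"]) <= 64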