GERM-NT-2.5B-multi / special_tokens_map.json
{
  "cls_token": "<cls>",
  "mask_token": "<mask>",
  "pad_token": "<pad>",
  "unk_token": "<unk>"
}
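For reference, a minimal sketch of how this map can be parsed and inspected with the standard `json` module. The JSON string below is copied from the file contents above; in practice a Hugging Face tokenizer would load this file automatically (e.g. via `AutoTokenizer.from_pretrained`, assumed here and not shown):

```python
import json

# Contents of special_tokens_map.json, copied verbatim from above.
raw = """
{
  "cls_token": "<cls>",
  "mask_token": "<mask>",
  "pad_token": "<pad>",
  "unk_token": "<unk>"
}
"""

special_tokens = json.loads(raw)

# Each entry maps a special-token role to its literal string form.
for role, token in special_tokens.items():
    print(f"{role}: {token}")
```

Note that this map defines `<cls>`, `<mask>`, `<pad>`, and `<unk>` but no separator (`sep_token`) or beginning/end-of-sequence tokens, which is common for masked-language-model-style tokenizers.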