tokenizers / bytelevel /special_tokens_map.json
{
"eos_token": "<|endoftext|>",
"pad_token": "<|padding|>"
}
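A minimal sketch of how this map is consumed: the JSON associates special-token roles (`eos_token`, `pad_token`) with their literal token strings. The snippet below inlines the file's contents for illustration; loading it from disk at the path shown above is an assumption about the repo layout.

```python
import json

# Contents of special_tokens_map.json, inlined for illustration.
raw = '{"eos_token": "<|endoftext|>", "pad_token": "<|padding|>"}'
special_tokens = json.loads(raw)

# A tokenizer loader reads this map to register special tokens, e.g.
# terminating generated sequences at eos_token and padding batches
# to equal length with pad_token.
eos = special_tokens["eos_token"]  # "<|endoftext|>"
pad = special_tokens["pad_token"]  # "<|padding|>"
print(eos, pad)
```

With the Hugging Face `transformers` library, an equivalent map is typically loaded automatically by `AutoTokenizer.from_pretrained(...)` rather than parsed by hand.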