HerBERT can be loaded and used with the XLMTokenizer and RobertaModel classes:

import torch
from transformers import XLMTokenizer, RobertaModel

tokenizer = XLMTokenizer.from_pretrained("allegro/herbert-klej-cased-tokenizer-v1")
model = RobertaModel.from_pretrained("allegro/herbert-klej-cased-v1")

encoded_input = tokenizer.encode("Zdanie przykładowe.", return_tensors="pt")  # placeholder sentence; the original example text was truncated
outputs = model(encoded_input)
HerBERT can also be loaded using AutoTokenizer and AutoModel:
import torch
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("allegro/herbert-klej-cased-tokenizer-v1")
model = AutoModel.from_pretrained("allegro/herbert-klej-cased-v1")
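
As a quick sanity check (a minimal sketch; the concrete class names depend on the checkpoint's configuration files), you can inspect which classes the Auto factories resolved to:

# Inspect the concrete classes chosen by AutoTokenizer / AutoModel for this checkpoint.
print(type(tokenizer).__name__)
print(type(model).__name__)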

The HerBERT implementation is the same as BERT's, except for the tokenization method.
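
To illustrate, a minimal sketch (assuming the tokenizer loaded above; the sentence is a placeholder, and the exact sub-word split depends on the trained vocabulary) shows the sub-word pieces the model actually sees:

# HerBERT's tokenizer splits text into sub-word units that differ from BERT's WordPiece output.
tokens = tokenizer.tokenize("Zdanie przykładowe.")  # placeholder Polish sentence
print(tokens)                                       # list of sub-word token strings
print(tokenizer.convert_tokens_to_ids(tokens))      # corresponding vocabulary ids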