Let's see how this looks in an example:
```python
from transformers import BertTokenizer, BertForSequenceClassification
import torch

tokenizer = BertTokenizer.from_pretrained("google-bert/bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("google-bert/bert-base-uncased")

inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
labels = torch.tensor([1]).unsqueeze(0)  # Batch size 1
outputs = model(**inputs, labels=labels)
```

The `outputs` object is a [`~modeling_outputs.SequenceClassifierOutput`]. As we can see in the documentation of that class below, it has an optional `loss`, a `logits`, an optional `hidden_states`, and an optional `attentions` attribute.
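
For instance, continuing the example above (a minimal sketch; the exact values depend on the model and inputs), the returned object contains a `loss` because we passed `labels`, while `hidden_states` and `attentions` stay `None` because we did not request them:

```python
# `labels` were passed, so a classification loss is computed.
print(outputs.loss)          # a scalar tensor (cross-entropy loss from the classification head)
print(outputs.logits.shape)  # torch.Size([1, 2]): one example, two labels by default

# These are None because we did not pass output_hidden_states=True / output_attentions=True.
print(outputs.hidden_states, outputs.attentions)
```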