You can pass pandas DataFrames together with query strings to the tokenizer, and it will automatically create the `input_ids` and `attention_mask` (as shown in the usage examples below).