",
     return_tensors="pt",
 )
generated_ids = model.generate(**inputs)
tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
["Why You Shouldn't Quit Your Job"]
generated_ids = model_with_prompt.generate(**inputs)
tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
["Don't do it if these are your reasons"]

For data-to-text generation, here is an example of using MVP together with a multi-task pre-trained variant.
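
A minimal sketch of that call pattern, assuming the `RUCAIBox/mvp` base checkpoint, a multi-task variant checkpoint (here `RUCAIBox/mtl-data-to-text`, to be checked against the released variants), and a `Describe the following data:` prompt with the input triples linearized into a single string:

```python
from transformers import MvpTokenizer, MvpForConditionalGeneration

tokenizer = MvpTokenizer.from_pretrained("RUCAIBox/mvp")
model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp")
# multi-task pre-trained variant; checkpoint name is an assumption to verify
model_with_mtl = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mtl-data-to-text")

# linearize the structured input (e.g. knowledge-graph triples) into one prompt string
inputs = tokenizer(
    "Describe the following data: Iron Man | instance of | Superhero [SEP] Stan Lee | creator | Iron Man",
    return_tensors="pt",
)

generated_ids = model.generate(**inputs)
print(tokenizer.batch_decode(generated_ids, skip_special_tokens=True))

generated_ids = model_with_mtl.generate(**inputs)
print(tokenizer.batch_decode(generated_ids, skip_special_tokens=True))
```

The multi-task variant is loaded and called exactly like the base model; only the checkpoint name changes.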