The example below translates Python code to English with the `uclanlp/plbart-python-en_XX` checkpoint. When generating the target text, `decoder_start_token_id` is set to the target language id so decoding starts in English:

```python
from transformers import PLBartForConditionalGeneration, PLBartTokenizer

tokenizer = PLBartTokenizer.from_pretrained("uclanlp/plbart-python-en_XX", src_lang="python", tgt_lang="en_XX")
example_python_phrase = "def maximum(a,b,c):NEW_LINE_INDENTreturn max([a,b,c])"
inputs = tokenizer(example_python_phrase, return_tensors="pt")

model = PLBartForConditionalGeneration.from_pretrained("uclanlp/plbart-python-en_XX")

# Start decoding with the target language code so the model generates English.
translated_tokens = model.generate(**inputs, decoder_start_token_id=tokenizer.lang_code_to_id["en_XX"])
tokenizer.batch_decode(translated_tokens, skip_special_tokens=True)[0]
# "Returns the maximum value of a b c."
```
## Resources

- Text classification task guide
- Causal language modeling task guide
- Translation task guide
- Summarization task guide
## PLBartConfig

[[autodoc]] PLBartConfig
## PLBartTokenizer

[[autodoc]] PLBartTokenizer
    - build_inputs_with_special_tokens
## PLBartModel

[[autodoc]] PLBartModel
    - forward
## PLBartForConditionalGeneration

[[autodoc]] PLBartForConditionalGeneration
    - forward
## PLBartForSequenceClassification

[[autodoc]] PLBartForSequenceClassification
    - forward
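`PLBartForSequenceClassification` adds a classification head on top of the model, e.g. for code classification tasks. A minimal sketch, assuming the multilingual `uclanlp/plbart-base` checkpoint and a hypothetical binary label set; the classification head is randomly initialized until the model is fine-tuned:

```python
import torch
from transformers import PLBartForSequenceClassification, PLBartTokenizer

tokenizer = PLBartTokenizer.from_pretrained("uclanlp/plbart-base", src_lang="python")
# num_labels=2 is a hypothetical binary task; the head starts untrained.
model = PLBartForSequenceClassification.from_pretrained("uclanlp/plbart-base", num_labels=2)

inputs = tokenizer("def maximum(a,b,c):NEW_LINE_INDENTreturn max([a,b,c])", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()
```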
## PLBartForCausalLM

[[autodoc]] PLBartForCausalLM
    - forward
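`PLBartForCausalLM` uses the decoder standalone as a causal language model. A minimal sketch, assuming the `uclanlp/plbart-base` checkpoint with cross-attention disabled:

```python
from transformers import AutoTokenizer, PLBartForCausalLM

tokenizer = AutoTokenizer.from_pretrained("uclanlp/plbart-base")
# Load only the decoder as a standalone language model (no encoder cross-attention).
model = PLBartForCausalLM.from_pretrained("uclanlp/plbart-base", add_cross_attention=False)

inputs = tokenizer("def maximum(a,b,c):", return_tensors="pt")
outputs = model(**inputs)
# Next-token prediction scores over the vocabulary for each input position.
logits = outputs.logits
```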