A study on how the immune system"]
```
</pt>
<tf>
Tokenize the text and return the `input_ids` as TensorFlow tensors:
```py
from transformers import AutoTokenizer

# `prompt` is the text prompt defined earlier in the guide
tokenizer = AutoTokenizer.from_pretrained("username/my_awesome_eli5_clm-model")
inputs = tokenizer(prompt, return_tensors="tf").input_ids
```
Use the [`~transformers.generation_tf_utils.TFGenerationMixin.generate`] method to generate text from the prompt.
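For illustration, a minimal sketch of the generation step, assuming the same fine-tuned `username/my_awesome_eli5_clm-model` checkpoint used above and illustrative sampling settings:

```py
from transformers import TFAutoModelForCausalLM

# Load the fine-tuned causal language model (same checkpoint as the tokenizer above)
model = TFAutoModelForCausalLM.from_pretrained("username/my_awesome_eli5_clm-model")

# Sample up to 100 new tokens from the prompt; the sampling parameters here are example values
outputs = model.generate(input_ids=inputs, max_new_tokens=100, do_sample=True, top_k=50, top_p=0.95)
```

The generated token ids can then be decoded back into text with `tokenizer.batch_decode(outputs, skip_special_tokens=True)`.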