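Define the training hyperparameters in [~transformers.TrainingArguments]. Importantly, keep remove_unused_columns set to False; otherwise the image column would be dropped, and without it you cannot create pixel_values.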
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-resnet-50_finetuned_cppe5",
    per_device_train_batch_size=8,
    num_train_epochs=10,
    fp16=True,                    # mixed-precision training
    save_steps=200,
    logging_steps=50,
    learning_rate=1e-5,
    weight_decay=1e-4,
    save_total_limit=2,           # keep only the two most recent checkpoints
    remove_unused_columns=False,  # keep the image column so pixel_values can be created
    push_to_hub=True,             # push checkpoints to the Hugging Face Hub
)
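The collate_fn passed as the data collator below is the one defined during preprocessing. For reference, a minimal sketch of such a collator might look like the following; it assumes each example already contains pixel_values and labels produced by the image processor, and it uses the image processor's pad method to batch images of different sizes:

def collate_fn(batch):
    # pad images to the largest size in the batch and build the matching pixel mask
    pixel_values = [item["pixel_values"] for item in batch]
    encoding = image_processor.pad(pixel_values, return_tensors="pt")
    labels = [item["labels"] for item in batch]
    return {
        "pixel_values": encoding["pixel_values"],
        "pixel_mask": encoding["pixel_mask"],
        "labels": labels,
    }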
Finally, bring everything together and call [~transformers.Trainer.train]:
from transformers import Trainer

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=collate_fn,
    train_dataset=cppe5["train"],
    tokenizer=image_processor,  # saved alongside the model so it can be reloaded later
)

trainer.train()
If you have set `push_to_hub` to `True` in the `training_args`, the training checkpoints are pushed to the Hugging Face Hub.
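Once training completes, you can also share the final model and image processor on the Hub by calling [~transformers.Trainer.push_to_hub]:

trainer.push_to_hub()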