detr_finetuned

This model is a fine-tuned version of apkonsta/table-transformer-detection-ifrs on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 4914.5493
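
The fine-tuned weights can be loaded through the standard Transformers object-detection API. The snippet below is a minimal sketch rather than an official usage example: it assumes the checkpoint is published under the repository id Nihel13/detr_finetuned and uses a placeholder image path.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

MODEL_ID = "Nihel13/detr_finetuned"  # assumed repository id for this checkpoint

processor = AutoImageProcessor.from_pretrained(MODEL_ID)
model = AutoModelForObjectDetection.from_pretrained(MODEL_ID)
model.eval()

# Placeholder path: replace with a real document/page image.
image = Image.open("page.png").convert("RGB")

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits and boxes into (score, label, box) detections in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.7, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```

The 0.7 confidence threshold is an arbitrary starting point and should be tuned against the (undocumented) evaluation data.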

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 10
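
For reproducibility, these values map directly onto Transformers TrainingArguments. The sketch below is an assumption, not the author's original training script: the output directory name is a placeholder, and the evaluation cadence is inferred from the 100-step intervals in the results table.

```python
from transformers import AutoModelForObjectDetection, TrainingArguments

# Base checkpoint named at the top of this card.
model = AutoModelForObjectDetection.from_pretrained("apkonsta/table-transformer-detection-ifrs")

# Hyperparameters copied from the list above.
args = TrainingArguments(
    output_dir="detr_finetuned",  # placeholder output directory
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",          # AdamW with betas=(0.9, 0.999) and epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=10,
    eval_strategy="steps",
    eval_steps=100,               # inferred from the 100-step rows in the results table
)
```

These arguments would then be passed to a Trainer together with the training and evaluation datasets, which are not documented on this card.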

Training results

| Training Loss | Epoch  | Step | Validation Loss |
|--------------:|-------:|-----:|----------------:|
| 4887.0844     | 0.2381 | 100  | 4915.8008       |
| 4806.7622     | 0.4762 | 200  | 4915.0762       |
| 4883.3641     | 0.7143 | 300  | 4914.7651       |
| 4869.7525     | 0.9524 | 400  | 4914.6387       |
| 4809.9978     | 1.1905 | 500  | 4914.6133       |
| 4820.81       | 1.4286 | 600  | 4914.6948       |
| 4779.5872     | 1.6667 | 700  | 4914.6743       |
| 5094.1991     | 1.9048 | 800  | 4914.5752       |
| 4851.7441     | 2.1429 | 900  | 4914.6494       |
| 4928.8484     | 2.3810 | 1000 | 4914.5767       |
| 4852.6178     | 2.6190 | 1100 | 4914.5840       |
| 4855.8131     | 2.8571 | 1200 | 4914.5991       |
| 4948.5747     | 3.0952 | 1300 | 4914.5967       |
| 4887.945      | 3.3333 | 1400 | 4914.5645       |
| 4900.1669     | 3.5714 | 1500 | 4914.5747       |
| 4937.1328     | 3.8095 | 1600 | 4914.5571       |
| 4792.3219     | 4.0476 | 1700 | 4914.6787       |
| 4842.8072     | 4.2857 | 1800 | 4914.5640       |
| 4914.0503     | 4.5238 | 1900 | 4914.6113       |
| 4892.0153     | 4.7619 | 2000 | 4914.5693       |
| 4882.0288     | 5.0    | 2100 | 4914.5630       |
| 4903.9891     | 5.2381 | 2200 | 4914.5679       |
| 4870.5566     | 5.4762 | 2300 | 4914.5688       |
| 4919.3287     | 5.7143 | 2400 | 4914.5508       |
| 4927.9272     | 5.9524 | 2500 | 4914.5488       |
| 4981.8925     | 6.1905 | 2600 | 4914.5537       |
| 4864.6322     | 6.4286 | 2700 | 4914.5835       |
| 4794.4006     | 6.6667 | 2800 | 4914.5820       |
| 4878.885      | 6.9048 | 2900 | 4914.5488       |
| 4967.0887     | 7.1429 | 3000 | 4914.5518       |
| 4937.0766     | 7.3810 | 3100 | 4914.5464       |
| 4829.3891     | 7.6190 | 3200 | 4914.5493       |
| 4812.0778     | 7.8571 | 3300 | 4914.5459       |
| 4823.5034     | 8.0952 | 3400 | 4914.5444       |
| 4919.2544     | 8.3333 | 3500 | 4914.5474       |
| 4838.375      | 8.5714 | 3600 | 4914.5581       |
| 4832.6153     | 8.8095 | 3700 | 4914.5513       |
| 4787.5813     | 9.0476 | 3800 | 4914.5464       |
| 4862.2234     | 9.2857 | 3900 | 4914.5464       |
| 4878.2669     | 9.5238 | 4000 | 4914.5474       |
| 4933.3856     | 9.7619 | 4100 | 4914.5488       |
| 4945.8159     | 10.0   | 4200 | 4914.5493       |

Framework versions

  • Transformers 4.48.3
  • Pytorch 2.5.1+cu124
  • Tokenizers 0.21.0