# allenai-scibert_scivocab_uncased_20241230-091934
This model is a fine-tuned version of allenai/scibert_scivocab_uncased on an unknown dataset. It achieves the following results on the evaluation set, where Metric@t denotes the metric computed at decision threshold t:
- Loss: 0.1032

| Threshold | Precision | Recall | F1 | Accuracy |
|:---------:|:---------:|:------:|:------:|:--------:|
| 0.01 | 0.8166 | 0.9988 | 0.8985 | 0.9027 |
| 0.02 | 0.8430 | 0.9980 | 0.9140 | 0.9190 |
| 0.03 | 0.8560 | 0.9973 | 0.9213 | 0.9265 |
| 0.04 | 0.8666 | 0.9966 | 0.9271 | 0.9324 |
| 0.05 | 0.8740 | 0.9960 | 0.9310 | 0.9364 |
| 0.06 | 0.8810 | 0.9959 | 0.9350 | 0.9402 |
| 0.07 | 0.8870 | 0.9950 | 0.9379 | 0.9431 |
| 0.08 | 0.8914 | 0.9950 | 0.9404 | 0.9456 |
| 0.09 | 0.8959 | 0.9947 | 0.9427 | 0.9479 |
| 0.10 | 0.9000 | 0.9942 | 0.9448 | 0.9499 |
| 0.11 | 0.9031 | 0.9938 | 0.9463 | 0.9513 |
| 0.12 | 0.9066 | 0.9931 | 0.9479 | 0.9529 |
| 0.13 | 0.9089 | 0.9929 | 0.9491 | 0.9540 |
| 0.14 | 0.9108 | 0.9923 | 0.9498 | 0.9548 |
| 0.15 | 0.9138 | 0.9918 | 0.9512 | 0.9561 |
| 0.16 | 0.9163 | 0.9913 | 0.9524 | 0.9572 |
| 0.17 | 0.9184 | 0.9912 | 0.9534 | 0.9582 |
| 0.18 | 0.9206 | 0.9908 | 0.9544 | 0.9592 |
| 0.19 | 0.9218 | 0.9902 | 0.9548 | 0.9596 |
| 0.20 | 0.9231 | 0.9899 | 0.9553 | 0.9601 |
| 0.21 | 0.9244 | 0.9896 | 0.9559 | 0.9606 |
| 0.22 | 0.9257 | 0.9893 | 0.9564 | 0.9611 |
| 0.23 | 0.9268 | 0.9890 | 0.9569 | 0.9615 |
| 0.24 | 0.9274 | 0.9889 | 0.9571 | 0.9618 |
| 0.25 | 0.9284 | 0.9884 | 0.9575 | 0.9621 |
| 0.26 | 0.9293 | 0.9882 | 0.9579 | 0.9625 |
| 0.27 | 0.9302 | 0.9881 | 0.9583 | 0.9629 |
| 0.28 | 0.9310 | 0.9881 | 0.9587 | 0.9633 |
| 0.29 | 0.9320 | 0.9879 | 0.9591 | 0.9637 |
| 0.30 | 0.9331 | 0.9876 | 0.9596 | 0.9641 |
| 0.31 | 0.9338 | 0.9875 | 0.9599 | 0.9644 |
| 0.32 | 0.9346 | 0.9874 | 0.9603 | 0.9648 |
| 0.33 | 0.9353 | 0.9872 | 0.9605 | 0.9650 |
| 0.34 | 0.9357 | 0.9872 | 0.9608 | 0.9652 |
| 0.35 | 0.9364 | 0.9871 | 0.9611 | 0.9655 |
| 0.36 | 0.9367 | 0.9869 | 0.9612 | 0.9656 |
| 0.37 | 0.9373 | 0.9867 | 0.9613 | 0.9658 |
| 0.38 | 0.9380 | 0.9863 | 0.9616 | 0.9660 |
| 0.39 | 0.9388 | 0.9860 | 0.9618 | 0.9662 |
| 0.40 | 0.9397 | 0.9852 | 0.9619 | 0.9663 |
| 0.41 | 0.9408 | 0.9850 | 0.9624 | 0.9668 |
| 0.42 | 0.9414 | 0.9849 | 0.9627 | 0.9671 |
| 0.43 | 0.9414 | 0.9840 | 0.9623 | 0.9667 |
| 0.44 | 0.9415 | 0.9840 | 0.9623 | 0.9667 |
| 0.45 | 0.9415 | 0.9840 | 0.9623 | 0.9667 |
| 0.46 | 0.9415 | 0.9840 | 0.9623 | 0.9667 |
| 0.47 | 0.9415 | 0.9840 | 0.9623 | 0.9667 |
| 0.48 | 0.9415 | 0.9840 | 0.9623 | 0.9667 |
| 0.49 | 0.9415 | 0.9840 | 0.9623 | 0.9667 |
| 0.50 | 0.9415 | 0.9840 | 0.9623 | 0.9667 |
| 0.51 | 0.9415 | 0.9840 | 0.9623 | 0.9667 |
| 0.52 | 0.9415 | 0.9840 | 0.9623 | 0.9667 |
| 0.53 | 0.9415 | 0.9840 | 0.9623 | 0.9667 |
| 0.54 | 0.9415 | 0.9840 | 0.9623 | 0.9667 |
| 0.55 | 0.9415 | 0.9840 | 0.9623 | 0.9667 |
| 0.56 | 0.9415 | 0.9839 | 0.9622 | 0.9667 |
| 0.57 | 0.9416 | 0.9839 | 0.9623 | 0.9667 |
| 0.58 | 0.9416 | 0.9839 | 0.9623 | 0.9667 |
| 0.59 | 0.9417 | 0.9839 | 0.9623 | 0.9668 |
| 0.60 | 0.9417 | 0.9838 | 0.9623 | 0.9667 |
| 0.61 | 0.9417 | 0.9838 | 0.9623 | 0.9667 |
| 0.62 | 0.9417 | 0.9837 | 0.9622 | 0.9667 |
| 0.63 | 0.9418 | 0.9837 | 0.9623 | 0.9667 |
| 0.64 | 0.9418 | 0.9835 | 0.9622 | 0.9667 |
| 0.65 | 0.9421 | 0.9834 | 0.9623 | 0.9668 |
| 0.66 | 0.9442 | 0.9827 | 0.9631 | 0.9675 |
| 0.67 | 0.9462 | 0.9812 | 0.9634 | 0.9678 |
| 0.68 | 0.9469 | 0.9806 | 0.9634 | 0.9679 |
| 0.69 | 0.9477 | 0.9801 | 0.9636 | 0.9681 |
| 0.70 | 0.9491 | 0.9794 | 0.9640 | 0.9685 |
| 0.71 | 0.9504 | 0.9792 | 0.9646 | 0.9690 |
| 0.72 | 0.9509 | 0.9787 | 0.9646 | 0.9690 |
| 0.73 | 0.9523 | 0.9782 | 0.9651 | 0.9695 |
| 0.74 | 0.9532 | 0.9775 | 0.9652 | 0.9696 |
| 0.75 | 0.9538 | 0.9771 | 0.9653 | 0.9697 |
| 0.76 | 0.9547 | 0.9767 | 0.9656 | 0.9700 |
| 0.77 | 0.9552 | 0.9757 | 0.9653 | 0.9698 |
| 0.78 | 0.9558 | 0.9752 | 0.9654 | 0.9698 |
| 0.79 | 0.9564 | 0.9741 | 0.9652 | 0.9697 |
| 0.80 | 0.9569 | 0.9737 | 0.9652 | 0.9697 |
| 0.81 | 0.9574 | 0.9729 | 0.9651 | 0.9697 |
| 0.82 | 0.9584 | 0.9726 | 0.9654 | 0.9700 |
| 0.83 | 0.9591 | 0.9719 | 0.9655 | 0.9700 |
| 0.84 | 0.9599 | 0.9714 | 0.9656 | 0.9701 |
| 0.85 | 0.9605 | 0.9705 | 0.9655 | 0.9701 |
| 0.86 | 0.9618 | 0.9693 | 0.9656 | 0.9702 |
| 0.87 | 0.9625 | 0.9681 | 0.9653 | 0.9700 |
| 0.88 | 0.9630 | 0.9670 | 0.9650 | 0.9697 |
| 0.89 | 0.9644 | 0.9662 | 0.9653 | 0.9700 |
| 0.90 | 0.9661 | 0.9649 | 0.9655 | 0.9703 |
| 0.91 | 0.9666 | 0.9636 | 0.9651 | 0.9699 |
| 0.92 | 0.9679 | 0.9612 | 0.9645 | 0.9695 |
| 0.93 | 0.9691 | 0.9585 | 0.9638 | 0.9689 |
| 0.94 | 0.9712 | 0.9550 | 0.9630 | 0.9684 |
| 0.95 | 0.9736 | 0.9493 | 0.9613 | 0.9670 |
| 0.96 | 0.9769 | 0.9414 | 0.9588 | 0.9651 |
| 0.97 | 0.9805 | 0.9291 | 0.9541 | 0.9615 |
| 0.98 | 0.9845 | 0.9055 | 0.9434 | 0.9531 |
| 0.99 | 0.9891 | 0.8632 | 0.9219 | 0.9369 |
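The thresholded metrics above correspond to predicting the positive class whenever its predicted probability clears the threshold t. A minimal inference sketch, assuming the checkpoint is a binary sequence-classification head hosted under the repository id Kyle1668/allenai-scibert_scivocab_uncased_20241230-091934; the helper function, the example sentence, and the choice of index 1 as the positive class are illustrative assumptions, not part of the released code:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed repository id; adjust to wherever this checkpoint is hosted.
MODEL_ID = "Kyle1668/allenai-scibert_scivocab_uncased_20241230-091934"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

def predict(texts, threshold=0.5):
    """Return 1 when the positive-class probability is at least `threshold`, else 0."""
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumes a two-label head where index 1 is the positive class.
    positive_prob = torch.softmax(logits, dim=-1)[:, 1]
    return (positive_prob >= threshold).long().tolist()

print(predict(["The protein binds strongly to the receptor."], threshold=0.5))
```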
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
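For reference, these settings map onto a Hugging Face Trainer configuration roughly as follows. This is a sketch only; the output directory, evaluation strategy, dataset loading, and the compute_metrics callback that produces the thresholded metrics are not documented in this card and are assumptions here:

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="allenai-scibert_scivocab_uncased_20241230-091934",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",        # AdamW (torch) with default betas/epsilon
    lr_scheduler_type="linear",
    num_train_epochs=3,
    eval_strategy="epoch",      # assumption: metrics appear to be logged once per epoch
)
```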
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.6601        | 1.0   | 2436 | 0.6623          |
| 0.2161        | 2.0   | 4872 | 0.3014          |
| 0.0569        | 3.0   | 7308 | 0.1032          |

Precision, recall, F1, and accuracy across the 0.01-0.99 threshold sweep were also logged at every epoch. At epoch 1 the model predicted nearly everything positive at low and mid thresholds (precision ≈ 0.43-0.45, recall ≈ 1.0) and everything negative at high thresholds (precision, recall, and F1 of 0.0, accuracy 0.5687); by epoch 2 precision and recall had stabilized near 0.833 and 0.972 over most thresholds; the epoch-3 per-threshold values are the evaluation results reported above.
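As an illustration of how such a threshold sweep can be computed from gold labels and positive-class probabilities (a sketch assuming binary labels; the helper below uses scikit-learn and is not the script that produced the numbers above):

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def threshold_sweep(y_true, positive_probs, thresholds=np.arange(0.01, 1.00, 0.01)):
    """Compute precision/recall/F1/accuracy at each decision threshold.

    y_true: array of 0/1 gold labels; positive_probs: array of P(positive).
    """
    rows = []
    for t in thresholds:
        y_pred = (positive_probs >= t).astype(int)
        rows.append({
            "threshold": round(float(t), 2),
            "precision": precision_score(y_true, y_pred, zero_division=0),
            "recall": recall_score(y_true, y_pred, zero_division=0),
            "f1": f1_score(y_true, y_pred, zero_division=0),
            "accuracy": accuracy_score(y_true, y_pred),
        })
    return rows
```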
### Framework versions
- Transformers 4.48.0.dev0
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0