umutarpayy/kimya11_bert

This model is a fine-tuned version of dbmdz/bert-base-turkish-cased on an unknown dataset. It achieves the following results on the evaluation set at the final training epoch:

  • Train Loss: 0.1622
  • Train Accuracy: 0.9420
  • Validation Loss: 0.1268
  • Validation Accuracy: 0.9541
  • Epoch: 11

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: Adam (beta_1=0.9, beta_2=0.999, epsilon=1e-08, amsgrad=False; no weight decay, no gradient clipping; jit_compile=True)
  • learning rate schedule: transformers WarmUp wrapping a Keras PolynomialDecay — linear warmup over 702 steps to an initial learning rate of 3e-05, then polynomial decay (power=1.0, i.e. linear) to 0.0 over 6318 steps
  • training_precision: float32
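
The schedule in the optimizer config above can be sketched in plain Python: linear warmup over 702 steps up to 3e-05, then linear (power=1.0) decay to 0.0 over the 6318 decay steps. This is an illustration of the schedule's shape only; the actual training run used the `WarmUp`/`PolynomialDecay` classes from transformers and Keras.

```python
# Sketch of the learning-rate schedule implied by the hyperparameters above.
# Constants are taken directly from the optimizer config; the function itself
# is an illustrative reimplementation, not the code used in training.

INITIAL_LR = 3e-05
WARMUP_STEPS = 702     # 'warmup_steps' in the WarmUp config
DECAY_STEPS = 6318     # 'decay_steps' in the PolynomialDecay config

def learning_rate(step: int) -> float:
    """Learning rate at a given global training step."""
    if step < WARMUP_STEPS:
        # Linear warmup: rises from 0 to INITIAL_LR over WARMUP_STEPS.
        return INITIAL_LR * step / WARMUP_STEPS
    # Polynomial decay with power=1.0 is linear decay from INITIAL_LR
    # to the end_learning_rate of 0.0 over DECAY_STEPS.
    decay_step = min(step - WARMUP_STEPS, DECAY_STEPS)
    return INITIAL_LR * (1.0 - decay_step / DECAY_STEPS)
```

With these numbers the full schedule spans 702 + 6318 = 7020 optimizer steps, after which the learning rate stays at 0.0.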

Training results

| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|-----------:|---------------:|----------------:|--------------------:|------:|
| 2.5503 | 0.3053 | 1.1854 | 0.6154 | 0 |
| 1.0395 | 0.6669 | 0.7700 | 0.7340 | 1 |
| 0.7644 | 0.7450 | 0.6048 | 0.7853 | 2 |
| 0.6294 | 0.7893 | 0.5001 | 0.8269 | 3 |
| 0.5274 | 0.8217 | 0.4076 | 0.8526 | 4 |
| 0.4479 | 0.8491 | 0.3462 | 0.8857 | 5 |
| 0.3747 | 0.8693 | 0.2588 | 0.9167 | 6 |
| 0.3143 | 0.8908 | 0.2174 | 0.9274 | 7 |
| 0.2631 | 0.9072 | 0.1678 | 0.9412 | 8 |
| 0.2238 | 0.9209 | 0.1431 | 0.9530 | 9 |
| 0.1859 | 0.9344 | 0.1373 | 0.9487 | 10 |
| 0.1622 | 0.9420 | 0.1268 | 0.9541 | 11 |

Framework versions

  • Transformers 4.48.3
  • TensorFlow 2.18.0
  • Datasets 3.4.1
  • Tokenizers 0.21.1
