detr_finetuned_cppe5_comparison

This model is a fine-tuned version of Jesse020202/cppe5_setup_on_roadsign_test on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3939

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused (AdamW, fused PyTorch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • num_epochs: 20
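With the cosine scheduler, the learning rate decays smoothly from 5e-05 toward zero over the full run. A minimal sketch of that decay curve (assuming zero warmup steps, since none are listed above):

```python
import math

def cosine_lr(step: int, total_steps: int, base_lr: float = 5e-5) -> float:
    """Cosine-annealed learning rate, decaying from base_lr to 0.

    Mirrors lr_scheduler_type: cosine. Warmup is assumed to be 0
    because the hyperparameters list no warmup steps.
    """
    progress = min(step / max(1, total_steps), 1.0)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

total_steps = 2140  # 20 epochs x 107 steps/epoch, per the results table

start = cosine_lr(0, total_steps)            # base rate at step 0
midpoint = cosine_lr(1070, total_steps)      # half the base rate mid-run
end = cosine_lr(total_steps, total_steps)    # decays to ~0 at the end
```

In practice Transformers builds this schedule internally when `lr_scheduler_type="cosine"` is set in `TrainingArguments`; the function above only illustrates the shape of the curve.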

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 107  | 1.6000          |
| No log        | 2.0   | 214  | 1.5764          |
| No log        | 3.0   | 321  | 1.5325          |
| No log        | 4.0   | 428  | 1.5211          |
| 1.3796        | 5.0   | 535  | 1.5157          |
| 1.3796        | 6.0   | 642  | 1.5043          |
| 1.3796        | 7.0   | 749  | 1.4820          |
| 1.3796        | 8.0   | 856  | 1.4862          |
| 1.3796        | 9.0   | 963  | 1.4591          |
| 1.2857        | 10.0  | 1070 | 1.4494          |
| 1.2857        | 11.0  | 1177 | 1.4363          |
| 1.2857        | 12.0  | 1284 | 1.4385          |
| 1.2857        | 13.0  | 1391 | 1.4152          |
| 1.2857        | 14.0  | 1498 | 1.4067          |
| 1.1868        | 15.0  | 1605 | 1.4171          |
| 1.1868        | 16.0  | 1712 | 1.3954          |
| 1.1868        | 17.0  | 1819 | 1.3961          |
| 1.1868        | 18.0  | 1926 | 1.3974          |
| 1.1187        | 19.0  | 2033 | 1.3943          |
| 1.1187        | 20.0  | 2140 | 1.3939          |
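The step counts in the table imply 107 optimizer steps per epoch over 20 epochs. A quick arithmetic sanity check (assuming no gradient accumulation, which would also put the training set at roughly 107 × 8 ≈ 856 examples):

```python
steps_per_epoch = 107      # step count at epoch 1.0 in the table
num_epochs = 20
train_batch_size = 8       # from the hyperparameters

# Total optimizer steps over the run; matches the final table row (2140).
total_steps = steps_per_epoch * num_epochs

# Rough training-set size implied by the step count. Assumption: one
# optimizer step per batch; the last batch of each epoch may be partial,
# so this is a lower bound rather than an exact count.
approx_examples = steps_per_epoch * train_batch_size  # ~856
```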

Framework versions

  • Transformers 4.56.2
  • PyTorch 2.8.0
  • Datasets 4.1.1
  • Tokenizers 0.22.1