# detr_finetuned_bccd
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the BCCD blood cell detection dataset (Platelets, RBC and WBC). It achieves the following results on the evaluation set; a short usage sketch follows the metric list:
- Loss: 0.5863
- mAP: 0.5535
- mAP@50: 0.823
- mAP@75: 0.6013
- mAP (small): -1.0 (no small objects in the evaluation set)
- mAP (medium): 0.3472
- mAP (large): 0.638
- mAR@1: 0.4031
- mAR@10: 0.6432
- mAR@100: 0.7115
- mAR (small): -1.0
- mAR (medium): 0.542
- mAR (large): 0.73
- mAP (Platelets): 0.3468
- mAR@100 (Platelets): 0.5444
- mAP (RBC): 0.5782
- mAR@100 (RBC): 0.75
- mAP (WBC): 0.7356
- mAR@100 (WBC): 0.84
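The checkpoint can be loaded with the standard Transformers object-detection API. The sketch below is a minimal usage example, not part of the original training code; the image path `blood_smear.jpg` is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "Punn1403/detr_finetuned_bccd"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("blood_smear.jpg")  # placeholder path to a blood-smear image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to (score, label, box) detections above a 0.5 threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```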
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
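As a rough illustration, these hyperparameters map onto a `transformers.TrainingArguments` configuration along the following lines. This is a sketch assuming the Hugging Face `Trainer` was used; the output directory name and the `remove_unused_columns` setting are assumptions, and dataset/collator wiring is omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr_finetuned_bccd",  # assumed output directory
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",         # fused AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    remove_unused_columns=False,       # typically required for object-detection collators
)
```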
### Training results
| Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (small) | mAP (medium) | mAP (large) | mAR@1 | mAR@10 | mAR@100 | mAR (small) | mAR (medium) | mAR (large) | mAP (Platelets) | mAR@100 (Platelets) | mAP (RBC) | mAR@100 (RBC) | mAP (WBC) | mAR@100 (WBC) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 26 | 0.9642 | 0.1078 | 0.177 | 0.1243 | -1.0 | 0.0 | 0.113 | 0.0193 | 0.1197 | 0.2165 | -1.0 | 0.0 | 0.2165 | 0.0 | 0.0 | 0.3235 | 0.6496 | 0.0 | 0.0 |
| No log | 2.0 | 52 | 0.9589 | 0.1277 | 0.2441 | 0.1207 | -1.0 | 0.0405 | 0.1148 | 0.0413 | 0.1585 | 0.2468 | -1.0 | 0.1101 | 0.2116 | 0.0391 | 0.1056 | 0.3441 | 0.6349 | 0.0 | 0.0 |
| No log | 3.0 | 78 | 0.8392 | 0.2048 | 0.363 | 0.2117 | -1.0 | 0.111 | 0.2378 | 0.1376 | 0.459 | 0.6004 | -1.0 | 0.4609 | 0.5277 | 0.1078 | 0.4514 | 0.405 | 0.6685 | 0.1018 | 0.6812 |
| No log | 4.0 | 104 | 0.7755 | 0.3901 | 0.6199 | 0.436 | -1.0 | 0.149 | 0.4846 | 0.3071 | 0.5539 | 0.6514 | -1.0 | 0.4754 | 0.6805 | 0.1471 | 0.4792 | 0.4421 | 0.6862 | 0.581 | 0.7887 |
| No log | 5.0 | 130 | 0.7384 | 0.4471 | 0.7188 | 0.488 | -1.0 | 0.1917 | 0.5099 | 0.3475 | 0.5664 | 0.6504 | -1.0 | 0.4507 | 0.7194 | 0.1887 | 0.4597 | 0.4747 | 0.694 | 0.678 | 0.7975 |
| No log | 6.0 | 156 | 0.7484 | 0.4525 | 0.7284 | 0.4858 | -1.0 | 0.2084 | 0.5322 | 0.3549 | 0.5697 | 0.6437 | -1.0 | 0.4449 | 0.7039 | 0.2053 | 0.4528 | 0.4713 | 0.6784 | 0.681 | 0.8 |
| No log | 7.0 | 182 | 0.7283 | 0.4352 | 0.7337 | 0.4859 | -1.0 | 0.1647 | 0.4529 | 0.3455 | 0.5486 | 0.6284 | -1.0 | 0.4 | 0.6178 | 0.1618 | 0.3986 | 0.4761 | 0.688 | 0.6677 | 0.7987 |
| No log | 8.0 | 208 | 0.7011 | 0.4873 | 0.7735 | 0.527 | -1.0 | 0.2589 | 0.5811 | 0.3658 | 0.6008 | 0.6773 | -1.0 | 0.5058 | 0.7181 | 0.2553 | 0.5111 | 0.5051 | 0.7121 | 0.7015 | 0.8087 |
| No log | 9.0 | 234 | 0.6765 | 0.482 | 0.757 | 0.5214 | -1.0 | 0.257 | 0.5759 | 0.3619 | 0.5998 | 0.6877 | -1.0 | 0.5507 | 0.7247 | 0.2518 | 0.5556 | 0.5037 | 0.7212 | 0.6903 | 0.7862 |
| No log | 10.0 | 260 | 0.6977 | 0.4837 | 0.7833 | 0.5141 | -1.0 | 0.2669 | 0.5117 | 0.373 | 0.5761 | 0.654 | -1.0 | 0.4377 | 0.6952 | 0.2618 | 0.4431 | 0.4919 | 0.7077 | 0.6973 | 0.8112 |
| No log | 11.0 | 286 | 0.6463 | 0.5015 | 0.7766 | 0.552 | -1.0 | 0.2568 | 0.5259 | 0.3789 | 0.5999 | 0.6817 | -1.0 | 0.5043 | 0.6271 | 0.2527 | 0.4972 | 0.54 | 0.7416 | 0.7118 | 0.8062 |
| No log | 12.0 | 312 | 0.6382 | 0.5109 | 0.7939 | 0.5519 | -1.0 | 0.275 | 0.565 | 0.3855 | 0.612 | 0.6902 | -1.0 | 0.5101 | 0.7083 | 0.2719 | 0.5125 | 0.5403 | 0.7307 | 0.7206 | 0.8275 |
| No log | 13.0 | 338 | 0.6360 | 0.504 | 0.7943 | 0.5412 | -1.0 | 0.2616 | 0.562 | 0.3762 | 0.6118 | 0.6923 | -1.0 | 0.5188 | 0.7076 | 0.2602 | 0.5208 | 0.5406 | 0.7311 | 0.7111 | 0.825 |
| No log | 14.0 | 364 | 0.6422 | 0.5205 | 0.8 | 0.5632 | -1.0 | 0.305 | 0.6082 | 0.3915 | 0.6112 | 0.6849 | -1.0 | 0.5058 | 0.715 | 0.3038 | 0.5097 | 0.5381 | 0.7212 | 0.7197 | 0.8238 |
| No log | 15.0 | 390 | 0.7001 | 0.4877 | 0.7964 | 0.5097 | -1.0 | 0.3002 | 0.546 | 0.3682 | 0.5841 | 0.6589 | -1.0 | 0.513 | 0.6866 | 0.2976 | 0.5167 | 0.4969 | 0.6761 | 0.6685 | 0.7837 |
| No log | 16.0 | 416 | 0.6330 | 0.5173 | 0.7955 | 0.5706 | -1.0 | 0.3087 | 0.572 | 0.3811 | 0.6138 | 0.6855 | -1.0 | 0.5304 | 0.6651 | 0.3025 | 0.5278 | 0.5454 | 0.7236 | 0.7038 | 0.805 |
| No log | 17.0 | 442 | 0.6013 | 0.5356 | 0.8156 | 0.5856 | -1.0 | 0.3084 | 0.6229 | 0.3936 | 0.6348 | 0.7128 | -1.0 | 0.5565 | 0.7586 | 0.3051 | 0.5625 | 0.5594 | 0.7396 | 0.7423 | 0.8363 |
| No log | 18.0 | 468 | 0.6173 | 0.5414 | 0.8115 | 0.5947 | -1.0 | 0.3382 | 0.6081 | 0.4042 | 0.6367 | 0.7073 | -1.0 | 0.5435 | 0.736 | 0.3342 | 0.5472 | 0.5528 | 0.7334 | 0.7371 | 0.8413 |
| No log | 19.0 | 494 | 0.5997 | 0.5368 | 0.8061 | 0.5793 | -1.0 | 0.3236 | 0.641 | 0.3981 | 0.6291 | 0.7038 | -1.0 | 0.5319 | 0.7575 | 0.3217 | 0.5389 | 0.564 | 0.7474 | 0.7246 | 0.825 |
| 0.7657 | 20.0 | 520 | 0.5929 | 0.5384 | 0.8159 | 0.5793 | -1.0 | 0.3278 | 0.6412 | 0.3973 | 0.6361 | 0.7103 | -1.0 | 0.5449 | 0.7386 | 0.3258 | 0.5486 | 0.5732 | 0.7524 | 0.716 | 0.83 |
| 0.7657 | 21.0 | 546 | 0.5932 | 0.5403 | 0.8167 | 0.5962 | -1.0 | 0.3445 | 0.6262 | 0.401 | 0.6382 | 0.7129 | -1.0 | 0.5623 | 0.7249 | 0.3424 | 0.5639 | 0.571 | 0.7485 | 0.7074 | 0.8263 |
| 0.7657 | 22.0 | 572 | 0.5931 | 0.5422 | 0.82 | 0.5838 | -1.0 | 0.3427 | 0.6289 | 0.3984 | 0.6363 | 0.7068 | -1.0 | 0.5362 | 0.7272 | 0.3403 | 0.5389 | 0.5722 | 0.7478 | 0.7141 | 0.8338 |
| 0.7657 | 23.0 | 598 | 0.5896 | 0.5454 | 0.8141 | 0.6003 | -1.0 | 0.3437 | 0.6427 | 0.4001 | 0.6333 | 0.7036 | -1.0 | 0.5304 | 0.7365 | 0.3421 | 0.5347 | 0.5739 | 0.7486 | 0.7202 | 0.8275 |
| 0.7657 | 24.0 | 624 | 0.5918 | 0.5473 | 0.8161 | 0.5912 | -1.0 | 0.3487 | 0.61 | 0.4027 | 0.6445 | 0.7132 | -1.0 | 0.5551 | 0.7275 | 0.3462 | 0.5569 | 0.5733 | 0.7464 | 0.7225 | 0.8363 |
| 0.7657 | 25.0 | 650 | 0.5874 | 0.5496 | 0.8117 | 0.6142 | -1.0 | 0.3503 | 0.6333 | 0.4026 | 0.6448 | 0.7145 | -1.0 | 0.558 | 0.7279 | 0.3494 | 0.5597 | 0.5751 | 0.7487 | 0.7244 | 0.835 |
| 0.7657 | 26.0 | 676 | 0.5887 | 0.5453 | 0.8162 | 0.6013 | -1.0 | 0.3375 | 0.6329 | 0.4004 | 0.6365 | 0.7085 | -1.0 | 0.5391 | 0.7279 | 0.3376 | 0.5417 | 0.5754 | 0.7476 | 0.723 | 0.8363 |
| 0.7657 | 27.0 | 702 | 0.5881 | 0.5518 | 0.8205 | 0.6036 | -1.0 | 0.3489 | 0.6358 | 0.4014 | 0.6438 | 0.7113 | -1.0 | 0.5478 | 0.728 | 0.3483 | 0.55 | 0.5777 | 0.7489 | 0.7293 | 0.835 |
| 0.7657 | 28.0 | 728 | 0.5865 | 0.5531 | 0.8225 | 0.6003 | -1.0 | 0.3494 | 0.6369 | 0.4022 | 0.6443 | 0.7122 | -1.0 | 0.5478 | 0.7289 | 0.3487 | 0.55 | 0.5784 | 0.7504 | 0.7321 | 0.8363 |
| 0.7657 | 29.0 | 754 | 0.5864 | 0.5532 | 0.8226 | 0.6009 | -1.0 | 0.3464 | 0.6379 | 0.4031 | 0.6431 | 0.7114 | -1.0 | 0.542 | 0.73 | 0.3462 | 0.5444 | 0.5779 | 0.7499 | 0.7356 | 0.84 |
| 0.7657 | 30.0 | 780 | 0.5863 | 0.5535 | 0.823 | 0.6013 | -1.0 | 0.3472 | 0.638 | 0.4031 | 0.6432 | 0.7115 | -1.0 | 0.542 | 0.73 | 0.3468 | 0.5444 | 0.5782 | 0.75 | 0.7356 | 0.84 |
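The mAP/mAR columns follow COCO-style evaluation, with per-class values for Platelets, RBC and WBC, and -1.0 wherever a size bucket contains no ground-truth objects. Below is a minimal sketch of computing comparable metrics with `torchmetrics`; this is an assumption about tooling, not necessarily how the numbers above were produced, and the boxes and labels shown are hypothetical.

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# class_metrics=True also yields per-class mAP / mAR@100, as in the per-class columns above.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# Hypothetical predictions and targets for one image; boxes are [xmin, ymin, xmax, ymax].
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 60.0, 80.0]]),
    "scores": torch.tensor([0.88]),
    "labels": torch.tensor([1]),  # hypothetical mapping, e.g. 0 = Platelets, 1 = RBC, 2 = WBC
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 58.0, 82.0]]),
    "labels": torch.tensor([1]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["map_75"], results["mar_100"])
# results["map_per_class"] and results["mar_100_per_class"] hold the per-class values;
# size-bucketed entries are reported as -1 when that bucket has no ground-truth objects.
```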
### Framework versions
- Transformers 4.56.1
- Pytorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.0