Training big and small objects

Hey everyone,

I’m working on an object detection task, and the main challenge is that the object can appear very close to the camera (large in the image) or far away (small in the image), so its scale varies significantly.

The dataset is structured as follows:

  • Train Set: 6,500 images (69%)
  • Validation Set: 1,960 images (21%)
  • Test Set: 950 images (10%)
  • It includes only 1 class

What I’ve tried in the dataset:

  • Resizing images to larger dimensions – hoping to improve detection of small objects.
  • Tiling (splitting images into smaller patches) – helps small object detection, but hurts performance on large/close objects.
  • Augmentations – mostly rotations and flips.
  • Regularization – using dropout and weight_decay for better generalization.
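To clarify what I mean by tiling: roughly, my patch logic looks like the sketch below (the tile size and overlap shown are just illustrative, not my exact settings). Detections made on a tile then have to be mapped back to full-image coordinates:

```python
def tile_image(width, height, tile=640, overlap=0.2):
    """Compute top-left corners of overlapping tiles covering the image.

    Tile size and overlap here are illustrative defaults.
    """
    stride = int(tile * (1 - overlap))
    xs = list(range(0, max(width - tile, 0) + 1, stride))
    ys = list(range(0, max(height - tile, 0) + 1, stride))
    # Make sure the right/bottom edges are always covered.
    if xs[-1] + tile < width:
        xs.append(width - tile)
    if ys[-1] + tile < height:
        ys.append(height - tile)
    return [(x, y) for y in ys for x in xs]

def to_image_coords(box, tile_origin):
    """Map a tile-local box (x1, y1, x2, y2) back to full-image coordinates."""
    ox, oy = tile_origin
    x1, y1, x2, y2 = box
    return (x1 + ox, y1 + oy, x2 + ox, y2 + oy)
```

A 1280×1280 image with 640px tiles at 20% overlap yields a 3×3 grid of patches.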

The problem:

Despite these efforts, I’m not seeing good validation loss metrics, and the model performs poorly on real-world test data. The validation metrics stagnate and don’t reflect the improvements I’m seeing on the training set. It’s as if the model isn’t generalizing well at all.

I had set training for 300 epochs, but I stopped early at epoch 82 because I wasn’t seeing meaningful improvements anymore. You can see the training and validation results below for more context.

Has anyone encountered something similar, or does anyone have suggestions for better strategies? Or am I possibly stopping too early? Should I just let it train for more epochs?

Appreciate any insights!

epoch time train/box_loss train/cls_loss train/dfl_loss metrics/precision(B) metrics/recall(B) metrics/mAP50(B) metrics/mAP50-95(B) val/box_loss val/cls_loss val/dfl_loss lr/pg0 lr/pg1 lr/pg2
1 522.216 1.21484 1.96486 1.12836 0.67687 0.60632 0.60813 0.26655 1.8101 3.08611 1.63794 0.00332922 0.00332922 0.00332922
2 1022.91 1.31218 1.22273 1.16458 0.78689 0.56853 0.63617 0.30262 1.83403 2.32352 1.57457 0.00664057 0.00664057 0.00664057
3 1518.51 1.55511 1.5925 1.31143 0.52895 0.27524 0.28172 0.12067 2.21075 4.57341 1.86992 0.00992991 0.00992991 0.00992991
4 2011.98 1.63137 1.667 1.39088 0.593 0.4247 0.45502 0.19556 2.05294 4.70136 1.77714 0.009901 0.009901 0.009901
5 2505.4 1.53401 1.47178 1.34638 0.61787 0.34518 0.39068 0.19191 2.03984 4.36694 1.90596 0.009868 0.009868 0.009868
6 2998.91 1.42394 1.29496 1.28213 0.68681 0.44219 0.50266 0.24261 1.83279 4.3846 1.71185 0.009835 0.009835 0.009835
7 3492.2 1.35369 1.20036 1.25799 0.64791 0.44941 0.48116 0.24485 1.97512 3.34732 1.91115 0.009802 0.009802 0.009802
8 3985.68 1.32409 1.13134 1.23312 0.77556 0.55936 0.61576 0.32805 1.68276 3.45275 1.59607 0.009769 0.009769 0.009769
9 4479.64 1.27902 1.08883 1.20429 0.7403 0.50367 0.57553 0.31498 1.6512 2.93583 1.56629 0.009736 0.009736 0.009736
10 4973.09 1.23218 1.00286 1.18737 0.78928 0.63168 0.69717 0.36714 1.66841 2.86489 1.5403 0.009703 0.009703 0.009703
11 5465.84 1.19959 0.96516 1.17394 0.8042 0.50874 0.5634 0.30076 1.88365 3.23127 1.77041 0.00967 0.00967 0.00967
12 5957.64 1.17893 0.92533 1.15081 0.7549 0.45347 0.53177 0.2801 1.87136 3.6947 1.73355 0.009637 0.009637 0.009637
13 6449.43 1.16446 0.9061 1.15364 0.77716 0.62098 0.68487 0.37692 1.62054 3.75686 1.553 0.009604 0.009604 0.009604
14 6941.33 1.14921 0.87996 1.14806 0.65501 0.54399 0.58923 0.33479 1.67707 2.8918 1.58345 0.009571 0.009571 0.009571
15 7433.11 1.12833 0.85889 1.12673 0.80677 0.50197 0.59101 0.33272 1.85315 2.65743 1.83476 0.009538 0.009538 0.009538
16 7925.19 1.12967 0.84466 1.12905 0.81164 0.54597 0.63194 0.36942 1.64185 2.25634 1.55806 0.009505 0.009505 0.009505
17 8418.02 1.11412 0.82857 1.12127 0.73095 0.51213 0.60409 0.33054 1.76379 2.91843 1.66224 0.009472 0.009472 0.009472
18 8910.88 1.09749 0.80709 1.11778 0.81209 0.57304 0.66994 0.38392 1.64257 2.87659 1.565 0.009439 0.009439 0.009439
19 9403.89 1.09755 0.80143 1.11697 0.80558 0.60688 0.68473 0.37299 1.69649 2.91798 1.62768 0.009406 0.009406 0.009406
20 9896.51 1.0548 0.7529 1.09096 0.76185 0.62972 0.68899 0.37983 1.66117 2.34526 1.56234 0.009373 0.009373 0.009373
21 10389.3 1.06606 0.75912 1.09768 0.81032 0.64862 0.71179 0.39808 1.64587 2.12032 1.61688 0.00934 0.00934 0.00934
22 10882.4 1.06216 0.76017 1.08806 0.73683 0.55443 0.63003 0.35298 1.73383 2.93674 1.63288 0.009307 0.009307 0.009307
23 11375.4 1.04865 0.73757 1.0902 0.78834 0.53187 0.60759 0.346 1.83557 2.54188 1.81177 0.009274 0.009274 0.009274
24 11867.9 1.02845 0.72646 1.07333 0.80087 0.66779 0.7146 0.38569 1.69118 2.33219 1.64683 0.009241 0.009241 0.009241
25 12360 1.02271 0.72385 1.07307 0.8251 0.65257 0.7358 0.42724 1.53099 2.20869 1.46722 0.009208 0.009208 0.009208
26 12852 1.02543 0.72413 1.07426 0.84572 0.67403 0.74567 0.39096 1.64241 2.46562 1.59911 0.009175 0.009175 0.009175
27 13344.1 1.0194 0.70401 1.0735 0.84892 0.68773 0.76908 0.42751 1.5688 2.23817 1.57047 0.009142 0.009142 0.009142
28 13836.1 1.00288 0.70374 1.06356 0.79053 0.63734 0.69186 0.37102 1.70804 2.33965 1.65748 0.009109 0.009109 0.009109
29 14328.1 1.0016 0.68263 1.06407 0.82787 0.63959 0.70614 0.38131 1.72777 2.78227 1.69889 0.009076 0.009076 0.009076
30 14820.1 0.99761 0.68556 1.06256 0.86947 0.68077 0.76394 0.41094 1.60313 2.24022 1.55327 0.009043 0.009043 0.009043
31 15311.9 0.98418 0.6723 1.04759 0.81426 0.59096 0.67857 0.37882 1.6873 2.42712 1.63534 0.00901 0.00901 0.00901
32 15803.9 0.98234 0.65655 1.05422 0.82178 0.6317 0.71356 0.40306 1.63952 2.56835 1.58867 0.008977 0.008977 0.008977
33 16295.7 0.97831 0.65267 1.0566 0.78994 0.64072 0.70264 0.38265 1.69188 2.18085 1.63365 0.008944 0.008944 0.008944
34 16787.5 0.96886 0.65189 1.04718 0.83129 0.62042 0.70442 0.40182 1.58603 2.03168 1.53025 0.008911 0.008911 0.008911
35 17279.5 0.96771 0.6377 1.04484 0.83411 0.7022 0.7732 0.43269 1.55718 1.9772 1.53043 0.008878 0.008878 0.008878
36 17771.5 0.95995 0.62991 1.0376 0.84028 0.70164 0.75636 0.4244 1.56176 1.74305 1.56389 0.008845 0.008845 0.008845
37 18263.5 0.97016 0.63313 1.04857 0.8233 0.7084 0.76562 0.42681 1.60227 1.79265 1.57748 0.008812 0.008812 0.008812
38 18755.3 0.95005 0.62194 1.03446 0.79285 0.61307 0.68154 0.378 1.74077 2.1006 1.69088 0.008779 0.008779 0.008779
39 19247.3 0.95267 0.61826 1.04004 0.79361 0.63979 0.68681 0.39442 1.60984 1.9887 1.60588 0.008746 0.008746 0.008746
40 19739.1 0.9507 0.62827 1.03624 0.85862 0.58827 0.70695 0.38816 1.73864 1.88352 1.70384 0.008713 0.008713 0.008713
41 20231.2 0.93861 0.60769 1.03277 0.81003 0.61086 0.67632 0.39489 1.62025 2.48937 1.58749 0.00868 0.00868 0.00868
42 20723.1 0.92736 0.59971 1.0277 0.86736 0.71461 0.79122 0.45333 1.51992 1.86183 1.53798 0.008647 0.008647 0.008647
43 21215.4 0.93979 0.60931 1.03236 0.83262 0.70421 0.76373 0.44253 1.5401 2.1002 1.56088 0.008614 0.008614 0.008614
44 21707.4 0.93311 0.60582 1.02705 0.81506 0.68302 0.74423 0.43162 1.54716 2.04472 1.55388 0.008581 0.008581 0.008581
45 22199.3 0.93647 0.5978 1.02664 0.85879 0.63799 0.71423 0.40383 1.66299 2.1135 1.63953 0.008548 0.008548 0.008548
46 22691.4 0.92197 0.58942 1.02421 0.797 0.65324 0.71029 0.39504 1.68392 2.14661 1.66845 0.008515 0.008515 0.008515
47 23183.5 0.91698 0.5869 1.01763 0.83077 0.67284 0.73961 0.41025 1.65131 2.03503 1.65676 0.008482 0.008482 0.008482
48 23675.4 0.92353 0.59655 1.02629 0.83158 0.65257 0.73709 0.41831 1.62022 1.96755 1.62116 0.008449 0.008449 0.008449
49 24167.4 0.91057 0.57557 1.02097 0.85291 0.65257 0.73786 0.41499 1.61445 1.9023 1.61761 0.008416 0.008416 0.008416
50 24659.4 0.91734 0.57847 1.02029 0.81978 0.66723 0.72674 0.4072 1.65262 1.89212 1.64908 0.008383 0.008383 0.008383
51 25151.3 0.91401 0.57681 1.01354 0.84017 0.62261 0.69639 0.4004 1.67521 2.14709 1.63936 0.00835 0.00835 0.00835
52 25643.2 0.91475 0.56948 1.02087 0.85815 0.64241 0.72874 0.40936 1.65306 2.05931 1.65125 0.008317 0.008317 0.008317
53 26135.4 0.91871 0.57822 1.02298 0.83687 0.63078 0.71286 0.40371 1.66654 2.08474 1.66221 0.008284 0.008284 0.008284
54 26628.3 0.89818 0.56556 1.00893 0.8567 0.61647 0.71303 0.41063 1.65267 2.08126 1.64175 0.008251 0.008251 0.008251
55 27120.7 0.89358 0.55441 1.00605 0.82141 0.62779 0.70964 0.40768 1.66756 1.9433 1.65734 0.008218 0.008218 0.008218
56 27613 0.90691 0.56969 1.01456 0.82919 0.61985 0.70424 0.4078 1.66147 1.94466 1.64821 0.008185 0.008185 0.008185
57 28105.1 0.88793 0.5562 1.00082 0.83115 0.64129 0.72691 0.41578 1.6426 2.01136 1.64052 0.008152 0.008152 0.008152
58 28597.1 0.88971 0.55124 1.00704 0.8165 0.66002 0.72896 0.41491 1.62602 1.97416 1.62541 0.008119 0.008119 0.008119
59 29089.3 0.88694 0.55185 1.00787 0.82668 0.65539 0.73126 0.42254 1.60206 1.96158 1.6119 0.008086 0.008086 0.008086
60 29581.3 0.88415 0.55145 1.00124 0.82365 0.66647 0.74262 0.43137 1.58116 1.99239 1.59302 0.008053 0.008053 0.008053
61 30073.5 0.87573 0.54475 0.99821 0.84897 0.64975 0.74147 0.43363 1.56543 2.07813 1.57774 0.00802 0.00802 0.00802
62 30565.6 0.87376 0.53715 0.99721 0.83692 0.64072 0.72491 0.42398 1.5943 2.09051 1.59859 0.007987 0.007987 0.007987
63 31057.7 0.87997 0.54421 0.99879 0.82268 0.65944 0.72985 0.42851 1.58635 2.07044 1.59786 0.007954 0.007954 0.007954
64 31549.9 0.88338 0.54574 0.99647 0.83636 0.65595 0.73146 0.42972 1.58956 2.07528 1.59955 0.007921 0.007921 0.007921
65 32041.9 0.86939 0.53009 0.99902 0.82981 0.67375 0.74494 0.43487 1.56645 2.11025 1.57551 0.007888 0.007888 0.007888
66 32534 0.86888 0.52719 0.98986 0.82793 0.65877 0.72653 0.42562 1.59266 2.15916 1.60304 0.007855 0.007855 0.007855
67 33026 0.86778 0.53091 0.99213 0.81759 0.67118 0.73297 0.42696 1.58835 2.1662 1.60797 0.007822 0.007822 0.007822
68 33518 0.86234 0.52204 0.98941 0.85248 0.63283 0.72383 0.41992 1.60805 2.23317 1.63362 0.007789 0.007789 0.007789
69 34010.3 0.86981 0.5317 0.99182 0.82705 0.6554 0.72736 0.42197 1.60652 2.25644 1.6361 0.007756 0.007756 0.007756
70 34502.6 0.85367 0.52074 0.98514 0.82769 0.66103 0.73187 0.42485 1.60178 2.29991 1.63473 0.007723 0.007723 0.007723
71 34994.7 0.86654 0.52725 0.99266 0.83826 0.65764 0.73971 0.42997 1.59075 2.34217 1.62481 0.00769 0.00769 0.00769
72 35486.8 0.86605 0.51941 0.99313 0.8412 0.66328 0.7446 0.43103 1.58745 2.35336 1.62359 0.007657 0.007657 0.007657
73 35978.9 0.87283 0.52838 0.99605 0.8295 0.66328 0.74511 0.43129 1.58362 2.31974 1.6206 0.007624 0.007624 0.007624
74 36471.1 0.85658 0.52035 0.98743 0.81899 0.67118 0.74861 0.4337 1.58197 2.29316 1.61952 0.007591 0.007591 0.007591
75 36963.2 0.86084 0.50966 0.98235 0.82763 0.67174 0.75006 0.43604 1.57002 2.30223 1.60872 0.007558 0.007558 0.007558
76 37455.3 0.8446 0.50891 0.98355 0.84385 0.67058 0.7542 0.43994 1.56128 2.26985 1.60002 0.007525 0.007525 0.007525
77 37947.3 0.84701 0.51086 0.9848 0.84195 0.66779 0.75378 0.43923 1.56334 2.27815 1.60453 0.007492 0.007492 0.007492
78 38439.4 0.84858 0.51168 0.98297 0.84082 0.67118 0.75583 0.44104 1.56015 2.25969 1.60355 0.007459 0.007459 0.007459
79 38931.4 0.83738 0.5017 0.98083 0.83937 0.67491 0.75836 0.44096 1.56079 2.22401 1.6092 0.007426 0.007426 0.007426
80 39423.6 0.84266 0.50386 0.98144 0.80814 0.69607 0.7594 0.44203 1.55712 2.22288 1.60772 0.007393 0.007393 0.007393
81 39915.7 0.84255 0.50482 0.98053 0.81553 0.69261 0.7577 0.44006 1.55782 2.23303 1.61069 0.00736 0.00736 0.00736
82 40407.9 0.8407 0.5005 0.97802 0.81542 0.68923 0.75489 0.438 1.56368 2.23563 1.61614 0.007327 0.007327 0.007327

Hi @Marc!
First off, this is a great use case for Roboflow!! To help us accelerate the triage process, would you give me permission to access your workspace? Additionally, I have a couple of clarifying questions to help us troubleshoot:

  • What model architecture are you using?

  • What is the range of object-of-interest and input image dimensions?

  • Are small and large objects equally represented across training and validation splits?

  • Are there specific failure modes? (e.g., missing small objects, false positives on large objects, etc.)

Thank you for contributing to the Roboflow community!!

Hi @Ford, thanks for helping me out!

  • What model architecture are you using?
    • I’m using YOLO11L
  • What is the range of object-of-interest and input image dimensions?
    • In terms of distance from the camera, the object can appear anywhere from right in front of it to about 100 feet away.
    • All images are resized to 1280×1280, and training is done at that resolution.
  • Are small and large objects equally represented across training and validation splits?
    • Yes, I tried to ensure both small and large object instances are evenly represented across the training and validation sets.
  • Are there specific failure modes? (e.g., missing small objects, false positives on large objects, etc.)
    • The model often misses small objects when they appear in front of complex or cluttered backgrounds. It seems to struggle with separating the object from background noise, especially when the object is farther away. The further the distance and the more complex the background, the worse the performance. On the other hand, when the object appears against a clean or simple background, it performs quite well.
      There are no major issues with false positives.

Hi @Marc!
Absolutely, always happy to help.

Thanks for the answers, it seems like the primary issue is small object detection in complex backgrounds. It’s one of the most difficult tasks in CV, but Roboflow is perfectly equipped to help you tackle it. We have a fantastic blog post that walks through Detecting Small Objects with Roboflow Workflows in depth.

Given that you have already implemented tiling, I suggest adding SAHI blocks to your workflow (discussed in the blog post above) to help improve your results. Additionally, if the foreground object is known, you could detect it with instance segmentation, blur or black it out, and then train on the far objects as the second stage of a two-stage process.
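To give a rough idea of what SAHI-style sliced inference does under the hood (an illustrative sketch, not the actual SAHI implementation): each tile is run through the detector, boxes are mapped back to full-image coordinates, and duplicate detections from overlapping tiles are collapsed with NMS:

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def merge_detections(dets, iou_thresh=0.5):
    """Greedy NMS over detections pooled from all tiles.

    `dets` is a list of (box, score) pairs already mapped to full-image
    coordinates; duplicates from overlapping tiles collapse to one box.
    """
    dets = sorted(dets, key=lambda d: d[1], reverse=True)
    kept = []
    for box, score in dets:
        if all(iou(box, k[0]) < iou_thresh for k in kept):
            kept.append((box, score))
    return kept
```

In practice you wouldn’t roll this yourself; you’d use the SAHI block in a Workflow (or the `sahi` library’s sliced prediction), which also handles slice sizing and per-slice confidence handling for you.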

Happy Building!!
