Hello,
I’m trying to understand why my confusion matrix shows a lot of “missed predictions.”
However, when I compare the “Ground Truth” and the “Model Predictions,” the boxes appear to be in the same positions.
I’ve noticed several such cases. Why is this happening, and what can I do to fix it?
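In case it helps frame the question: my understanding is that detection confusion matrices match predictions to ground truth by IoU (intersection over union), so a box that looks "in the same position" can still be counted as missed if its IoU falls just below the matching threshold, or if its confidence is below the cutoff. Here is a minimal sketch of an IoU check I could run on a suspect pair of boxes (box format `[x1, y1, x2, y2]` is an assumption, and `iou` is my own helper, not a Roboflow API):

```python
def iou(a, b):
    """Intersection-over-union of two boxes in [x1, y1, x2, y2] format."""
    # Coordinates of the intersection rectangle.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Example: boxes that look visually identical but are shifted by 1 px
# on a 10x10 box already drop well below IoU = 0.7.
print(iou([0, 0, 10, 10], [1, 1, 11, 11]))
```

If someone can confirm which IoU and confidence thresholds the confusion matrix uses, I could check whether my "missed" boxes are just below that cutoff.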
- **Project Type:** Object detection, YOLOv11 (accurate)
- **Operating System & Browser:** Windows, Firefox
- **Project Universe Link or Workspace/Project ID:** Sign in to Roboflow