Confusion matrix generating confusing math

Hi there,
I’ve been building a model that recognizes different types of produce in a tray using YOLOv8. I’m using the normalized confusion matrix to monitor the model’s performance, and I’m a little confused by some of the results I’m seeing.

I understand the diagonal to be “how often is it correctly predicting each class” (which is going pretty well except for bananas), but I don’t really understand why the squares in each row/column don’t add up to 1.

For example, bananas are .89 for bananas, .02 for lemons, and .44 for background. That seems to add up to 135%?

Or kiwis are .98 for kiwis, .01 for oranges, and .05 for background, which adds up to 104%?
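
To make the arithmetic concrete, here’s a toy NumPy sketch of what I naively expected the normalization to do (the counts are made up, not my real numbers). If each row were divided by its own total, every row would sum to 1, which clearly isn’t what my matrix is doing:

```python
import numpy as np

# Made-up detection counts, just to illustrate my expectation.
# rows = predicted class, columns = true class
classes = ["banana", "lemon", "kiwi", "background"]
counts = np.array([
    [89,  1,  0, 40],   # predicted banana
    [ 2, 95,  1,  5],   # predicted lemon
    [ 0,  0, 98,  6],   # predicted kiwi
    [ 9,  4,  1,  0],   # predicted background (nothing detected there)
], dtype=float)

# What I assumed: each row divided by its own total, so rows sum to 1.
row_norm = counts / counts.sum(axis=1, keepdims=True)
print(row_norm.round(2))
print("row sums:", row_norm.sum(axis=1))  # all 1.0 -- unlike my matrix
```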

A second, and I suspect related, question: I don’t really understand the background column and row. Do they represent objects the model thinks are part of the background, and parts of the background it thinks are objects?

Here’s my confusion matrix:

Thank you!

Hi @happydud!

First, I’d recommend our confusion matrix, which is built into the app (you can easily click into the images that make up each category).

Second, here’s a good blog post that goes into how these values are calculated.

That doesn’t really answer either of my questions.

I’m not sure exactly how the values are calculated; it might be worth opening an issue on the Ultralytics repo.

Hey @happydud

Our blog post on what confusion matrices are might be helpful for understanding how they work.
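
To give a rough sense of how these plots usually behave (this is a sketch of the general convention, not the exact code behind the app or Ultralytics, and the class names and counts below are made up): detection confusion matrices add an extra “background” class, and the counts are typically normalized along a single axis. If each column is normalized to sum to 1, the rows don’t have to sum to 1, which is consistent with row totals like 135%. Under that convention, the background row collects objects the model missed entirely, and the background column collects predictions that don’t match any real object.

```python
import numpy as np

# Hypothetical counts for two classes plus "background".
# Layout assumed here: rows = predicted class, columns = ground truth.
#   - a missed banana  -> predicted "background", true "banana"
#   - a false positive -> predicted "banana", true "background"
labels = ["banana", "kiwi", "background"]
counts = np.array([
    [80,  1, 30],   # predicted banana (30 = boxes with no real object behind them)
    [ 2, 90,  4],   # predicted kiwi
    [18,  9,  0],   # predicted background (18 bananas and 9 kiwis were missed)
], dtype=float)

# Dividing each *column* by its own total makes the columns sum to 1...
col_norm = counts / counts.sum(axis=0, keepdims=True)
print(col_norm.round(2))
print("column sums:", col_norm.sum(axis=0))           # [1. 1. 1.]
# ...but a row mixes values from differently scaled columns,
# so it can easily add up to more than 100%:
print("banana row sum:", col_norm[0].sum().round(2))  # > 1
```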