- Project type: Instance Segmentation
- OS & browser: Windows, Microsoft Edge
Hi,
I have a requirement to calculate mussel size from an image. I am using a scale in my reference image and calculating the dimensions with the function below.
function calculateRealWorldDimensions(pixelWidth, pixelHeight, scaleWidthPx, scaleHeightPx, scaleWidthMm, scaleHeightMm) {
const widthRatio = scaleWidthMm / scaleWidthPx; // Scale factor for width (mm per pixel)
const heightRatio = scaleHeightMm / scaleHeightPx; // Scale factor for height (mm per pixel)
const realWidth = pixelWidth * widthRatio; // Real-world width in mm
const realHeight = pixelHeight * heightRatio; // Real-world height in mm
return { realWidth, realHeight };
}
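For reference, this is how I call it (the numbers here are just an example, not my actual measurements):

// Example: a 10 mm x 10 mm scale square spanning 80 x 80 px in the image,
// and a mussel whose box is 520 x 180 px.
const dims = calculateRealWorldDimensions(520, 180, 80, 80, 10, 10);
console.log(dims); // { realWidth: 65, realHeight: 22.5 }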
- I am getting approximately correct dimensions when the mussel’s longest side is parallel to the scale (left part of the image).
- But when the mussel’s longest side is at an angle to the scale (i.e. not parallel to it), the dimensions are not accurate (right part of the image).
Is there a way I can get accurate dimensions even when the mussels are placed at an angle to the scale?
Any help would be appreciated.
Thanks,
Payal
Hi @Payal_Bhuva ,
I think there is not much we can do if the mussels are detected with an object detection model. However, if you can switch to an instance segmentation model, you could try a workflow with the minimal bounding box block.
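For intuition, below is a rough plain-JavaScript sketch of the idea behind a minimal (rotated) bounding rectangle - this is not the workflow block’s actual code, and it assumes you already have the instance mask as an array of [x, y] polygon points:

// Monotone-chain convex hull; points are [x, y] pairs in pixel coordinates.
function convexHull(points) {
  const pts = [...points].sort((a, b) => a[0] - b[0] || a[1] - b[1]);
  const cross = (o, a, b) =>
    (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0]);
  const lower = [];
  for (const p of pts) {
    while (lower.length >= 2 && cross(lower[lower.length - 2], lower[lower.length - 1], p) <= 0) lower.pop();
    lower.push(p);
  }
  const upper = [];
  for (let i = pts.length - 1; i >= 0; i--) {
    const p = pts[i];
    while (upper.length >= 2 && cross(upper[upper.length - 2], upper[upper.length - 1], p) <= 0) upper.pop();
    upper.push(p);
  }
  return lower.slice(0, -1).concat(upper.slice(0, -1));
}

// Minimal-area rotated rectangle: the optimal rectangle is aligned with one
// of the hull's edges, so try each edge direction and keep the smallest box.
function minAreaRect(polygonPoints) {
  const hull = convexHull(polygonPoints);
  let best = { area: Infinity, widthPx: 0, heightPx: 0, angle: 0 };
  for (let i = 0; i < hull.length; i++) {
    const [x1, y1] = hull[i];
    const [x2, y2] = hull[(i + 1) % hull.length];
    const angle = Math.atan2(y2 - y1, x2 - x1);
    const c = Math.cos(-angle), s = Math.sin(-angle);
    let minX = Infinity, maxX = -Infinity, minY = Infinity, maxY = -Infinity;
    for (const [x, y] of hull) {
      const rx = x * c - y * s, ry = x * s + y * c; // rotate so this edge is horizontal
      minX = Math.min(minX, rx); maxX = Math.max(maxX, rx);
      minY = Math.min(minY, ry); maxY = Math.max(maxY, ry);
    }
    const widthPx = maxX - minX, heightPx = maxY - minY;
    if (widthPx * heightPx < best.area) best = { area: widthPx * heightPx, widthPx, heightPx, angle };
  }
  return best; // the longer of widthPx/heightPx approximates the tip-to-tip length in pixels
}

The longer side of that rectangle tracks the mussel’s length regardless of how the mussel is rotated relative to the scale, which is why the segmentation route helps here.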
Hope this helps,
Grzegorz
Hi @Grzegorz ,
Thank you for your reply.
My project is an instance segmentation model. I tried the rectangular bounding box. It works fine when the mussel is parallel to the scale, but when the mussel is at an angle it shows the output below.
Is there any way I could get the longest length (tip to tip)?
Thanks,
Payal
Hi @Payal_Bhuva ,
Can you try whether the Size Measurement block produces the expected results? You can filter out the reference object using Detections Filter; the Size Measurement block accepts the reference dimensions as a configuration parameter. If the measured objects come from an instance segmentation model, the minimal bounding rect should automatically be used for the measurement (so you won’t need the minimal bounding rect block). Your workflow would look similar to the one below:
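For illustration only, the math such a measurement boils down to is roughly this (a plain-JavaScript sketch, not the block’s implementation; the input shapes are assumptions):

// mussel / reference: { widthPx, heightPx } of their minimal bounding rects.
// referenceMm: { width, height } - the known real-world size of the reference object.
function measureWithReference(mussel, reference, referenceMm) {
  const mmPerPxW = referenceMm.width / reference.widthPx;
  const mmPerPxH = referenceMm.height / reference.heightPx;
  // For a top-down photo both factors should be close; average them.
  const mmPerPx = (mmPerPxW + mmPerPxH) / 2;
  return {
    lengthMm: Math.max(mussel.widthPx, mussel.heightPx) * mmPerPx, // long side ≈ tip to tip
    widthMm: Math.min(mussel.widthPx, mussel.heightPx) * mmPerPx,
  };
}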
Hope this helps,
Grzegorz
Hi @Grzegorz ,
Thank you so much. I was able to find the measurement using this approach.
Also, how can we increase the accuracy of the dimensions? I can see a 5-6 mm difference from the actual size, mostly when the mussels are placed away from the scale or diagonally (not parallel) to it.
Appreciate your help!
Regards,
Payal
Hi @Payal_Bhuva ,
With today’s release of inference you will be able to visualize the minimal bounding rectangle, which should hopefully help you narrow down the root cause of the problem.
Have a look at the workflow below: the Bounding Rectangle block results in an updated Detections object, where the bounding rectangle can now be visualized with the Polygon Visualization block. Example output of that workflow is shown below (I was playing with my chess pieces model).
This should allow you to inspect whether the bounding rectangle fits the mussel correctly.
On another note - looking at the photos from your original post, it seems you are taking them directly from above, so hopefully there is no perspective that would skew the dimensions. However, if you do suspect perspective is at play, you might want to apply the perspective correction block.
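If perspective does turn out to matter, the idea is to map image points onto the plane of the scale before measuring. A minimal sketch, assuming you already have a 3x3 homography H for that plane (for example estimated from four known corners of the ruler or board):

// Map an image point [x, y] through the homography H (3x3 array of arrays).
function applyHomography(H, [x, y]) {
  const w = H[2][0] * x + H[2][1] * y + H[2][2];
  return [
    (H[0][0] * x + H[0][1] * y + H[0][2]) / w,
    (H[1][0] * x + H[1][1] * y + H[1][2]) / w,
  ];
}

// Measure tip-to-tip after removing perspective: rectify both tips, then
// apply the mm-per-pixel factor valid in the rectified plane.
function tipToTipMm(H, tipA, tipB, mmPerPxRectified) {
  const [ax, ay] = applyHomography(H, tipA);
  const [bx, by] = applyHomography(H, tipB);
  return Math.hypot(bx - ax, by - ay) * mmPerPxRectified;
}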
Hope this helps,
Grzegorz
Hi @Grzegorz,
Thank you for all your help. I tried to create a workflow similar to your suggestion, and it now gives the minimal bounding rectangle correctly. I think the accuracy issue might be the perspective of the image. Can you please show me where I should add the perspective block? I was trying to add one but ran into a few issues.
Thanks,
Payal
Hi Payal,
With regards to accuracy - do you notice inaccurate measurements for all mussels, or only for mussels positioned so that they are not parallel to the scale? (I suspect both.)
Are you taking photos by hand, or is the camera mounted on a tripod (so all photos have the same distance to the mussels and the scale)?
I’d check the following:
- Is the Reference Dimensions parameter set correctly? The model probably detects the whole ruler, so you would likely have to measure the real length of the whole ruler and not only the graduated scale - this is the first parameter to look at.
- I think you will get better accuracy with the camera further away from the scene. When the camera is closer, you may start experiencing lens distortion, and as a result the projection of pixels to real-world dimensions becomes non-linear: with the camera placed close to the scale, you may notice that the length of 1 cm taken from the very left of the scale differs from the length of 1 cm taken from the middle of the scale (directly under the center of the lens).
If you are taking photos with the camera mounted on a stand or tripod, you can attempt to calibrate the system by measuring an object of known size and adjusting Reference Dimensions until the measured size matches. If you have time, you can then measure such a known object at various spots - e.g. at the very center and in the corners - and observe the deviation.
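A minimal sketch of that calibration idea (all numbers below are made up):

// The workflow reports 54.5 mm for a calibration target that is really 50 mm.
const k = 50 / 54.5; // correction factor, ≈ 0.917

// Either multiply subsequent measurements by k, or equivalently scale the
// Reference Dimensions by the same factor and re-run the workflow.
const correctedMusselLengthMm = 96.3 * k; // ≈ 88.3 mm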
I’d also have one more question - in your deployment, do you use one camera or would you have multiple different cameras?
Grzegorz