Project Type: Instance Segmentation
Operating System & Browser: Windows 10 / Google Chrome
Project Universe Link or Workspace/Project ID: fungal-growth-boundary-detection
Hi everyone,
I am a third-year biology undergraduate working on an instance segmentation project for my dissertation: detecting hand-drawn measurement boundary lines on Petri dishes. The goal is to automate fungal growth area measurements by identifying the drawn lines and the plate edge.
A key requirement is that the model must not only detect the boundaries but also recognise their order from the centre of the plate outward (e.g., “1st measurement” is the innermost, “2nd measurement” is next, and so on). Additionally, I need the model to enable precise area measurements for each enclosed region.
What I’ve Done So Far:
- Collected and annotated roughly 20 images of single 60 mm plates with drawn measurement lines.
- Labelled each growth boundary (e.g., “1st measurement”, “2nd measurement”, etc.) and plate edge.
- Applied contrast stretching as preprocessing to enhance faint lines.
- Trained an Instance Segmentation model (Roboflow 3.0, MS COCO checkpoint).
- Achieved 90% mAP, 77.7% precision, 64.3% recall on the initial test set (10 images).
- Ran test predictions: most lines were detected, but the model struggled with plate edges and some faint/overlapping lines.
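For concreteness, the contrast stretching I applied is essentially a percentile-based rescale like the sketch below (pure NumPy; the 2/98 percentile values are illustrative choices of mine, not necessarily what Roboflow's built-in preprocessing uses):

```python
import numpy as np

def stretch_contrast(gray: np.ndarray, low_pct: float = 2.0, high_pct: float = 98.0) -> np.ndarray:
    """Linearly rescale a grayscale uint8 image so the low/high
    percentile intensities map to 0/255, clipping the tails."""
    lo, hi = np.percentile(gray, [low_pct, high_pct])
    stretched = (gray.astype(np.float32) - lo) / max(hi - lo, 1e-6)
    return (np.clip(stretched, 0.0, 1.0) * 255).astype(np.uint8)
```

A global stretch like this can still miss faint marker lines when lighting across the dish is uneven, which may be part of why the faint lines underperform.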
Next Steps:
- Adding more annotated images (5-10) to improve detection of plate edges and faint/overlapping lines.
- Considering whether tags (e.g., “Handwriting Present”, “Faded Ink”) might help model performance.
- Exploring ways to refine the dataset or adjust training settings (e.g., confidence thresholds, augmentations, more epochs, etc.).
Questions for the Forum:
- How can I ensure the model understands the order of boundaries from the plate centre outward rather than treating them as separate, unranked objects?
- Is there a better preprocessing technique (besides contrast stretching) to make the drawn lines stand out more?
- Are there specific techniques that might improve detection of faint or overlapping lines in an instance segmentation model?
- Any advice on extracting precise area measurements from each enclosed region after detection?
- Any strategies for handling text/handwriting on the plates so it does not interfere with detection?
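On the ordering question, one idea I have been considering is to not make the model learn the rank at all: train a single generic "boundary" class (so every annotation contributes to one class), then sort the predicted masks by mean pixel distance from the plate centre in post-processing. A sketch, assuming the predictions have been rasterised to binary masks and the plate centre estimated separately (none of this is a Roboflow feature):

```python
import numpy as np

def mean_radius(mask: np.ndarray, centre: tuple[float, float]) -> float:
    """Mean distance of a boundary mask's pixels from the plate centre.
    More robust than centroid distance for roughly concentric lines,
    whose centroids all sit near the centre."""
    cy, cx = centre
    ys, xs = np.nonzero(mask)
    return float(np.hypot(ys - cy, xs - cx).mean())

def rank_from_centre(masks: list[np.ndarray], centre: tuple[float, float]) -> list[np.ndarray]:
    """Sort boundary masks innermost-first, so index 0 is the
    '1st measurement', index 1 the '2nd measurement', and so on."""
    return sorted(masks, key=lambda m: mean_radius(m, centre))
```

Does that sound sensible, or is there a reason to prefer teaching the model ranked classes directly?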
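For the area question, my current plan is to use the known 60 mm dish diameter as the scale reference: measure the detected plate edge in pixels, derive a mm-per-pixel factor, then convert each filled region's pixel count to mm². A NumPy sketch (the function names and the filled-mask input format are my own assumptions for illustration):

```python
import numpy as np

def mm_per_pixel(plate_edge_mask: np.ndarray, plate_diameter_mm: float = 60.0) -> float:
    """Estimate scale from the plate-edge mask: its horizontal pixel
    extent is taken as the known dish diameter (60 mm here)."""
    xs = np.nonzero(plate_edge_mask)[1]
    diameter_px = int(xs.max() - xs.min() + 1)
    return plate_diameter_mm / diameter_px

def region_area_mm2(filled_region_mask: np.ndarray, scale_mm_per_px: float) -> float:
    """Area in mm^2 of a filled (not outline-only) region mask:
    pixel count times the squared linear scale."""
    return int(filled_region_mask.sum()) * scale_mm_per_px ** 2
```

One thing I am unsure about: the model predicts boundary *lines*, so I would still need to fill each enclosed contour before counting pixels. Pointers on robust ways to do that would be welcome.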
Sorry that’s long, but any help will be greatly appreciated.
Sincerely,
A struggling biology student.
Example image: