I’m new to Roboflow and, before digging too far into it, I’d like to see whether it can fulfill my needs.
I’m looking to develop an application that detects subtle motion over a few seconds in a fixed scene (parts of the scene are fixed reference points) and sends a trigger to an external application for further action. Detection has to happen in real time, ideally on a video stream coming from a webcam or a phone.
Are you looking for any delta in motion between frames, or a specific shift in a known object? Any example images you’re comfortable sharing would be great.
We support vision models running on live streams (phone or webcam would both work) - best technical path will be dependent on the specific problem you’re trying to solve.
Here is a picture overlaying 2 video frames, where you can see the delta I’m trying to detect on the eyelashes relative to the eye / face. This is one example, but several hairs move at the same time. The motion spans several frames; a back-and-forth takes roughly 1 second.
I’d probably look at something like Optical Flow for this as a starting point. But there are some newer approaches like the “Track Everything Everywhere All at Once” paper (and several newer ones inspired by it) that could also be interesting to look at (though I don’t think I’ve seen any that run in real time).
Both of these are on our todo list for Roboflow Workflows blocks but not sure when they’ll bubble up to the top (probably once a large customer needs them).
I took a look at optical flow and it does pick up the useful information in the video stream! However, the next thing I’ll have to do is find a way to stabilize the video before running optical flow on it. Ideally, I’d like to define some zones that I consider static, so that motion due to non-stabilized acquisition can be filtered out prior to using optical flow.
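One way to approximate that filtering without fully stabilizing the frames (a sketch, assuming you already have a dense flow field and a boolean mask of the zones you consider static; the function name is hypothetical): estimate the camera jitter as the median flow inside the static zones and subtract it from the whole field.

```python
import numpy as np

def compensate_jitter(flow, static_mask):
    """Subtract global jitter estimated from user-defined static zones.

    flow        : (H, W, 2) dense optical-flow field (dx, dy per pixel)
    static_mask : (H, W) boolean array, True where the scene is static
    """
    # Median flow inside the static zones approximates the camera
    # motion between the two frames.
    jitter = np.median(flow[static_mask], axis=0)
    # What remains after subtraction is motion relative to the scene.
    return flow - jitter
```

A median rather than a mean is used so that a small moving object straying into a “static” zone doesn’t bias the estimate. After compensation, residual magnitude in the static zones should sit near zero, and only scene-relative motion (the eyelashes) survives a threshold check.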
I’d recommend training a keypoint model on elements that have a static relation to the camera - you could then use those coordinates for stabilization.