Live video inference in the browser

Hello,

I am trying to perform inference in the browser using a webcam as the source, but I am struggling to find a working solution. Do you have any ideas or guidelines on how to achieve this by loading an ONNX model?

Additionally, I would like to add bounding boxes around detected objects as a subsequent step.
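
Something along the lines of the sketch below is what I have in mind, using onnxruntime-web; the input name ("images"), the 640x640 size, and the preprocessing are only assumptions, since they depend on how the model was exported.

```js
// Sketch: run an ONNX model on a webcam frame entirely in the browser with onnxruntime-web.
import * as ort from "onnxruntime-web"; // or load ort globally via a <script> tag

async function run() {
  // 1. Open the webcam and attach it to a <video> element.
  const video = document.querySelector("video");
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();

  // 2. Load the ONNX model (served from the same local web server).
  const session = await ort.InferenceSession.create("model.onnx");

  // 3. Grab a frame, resize it to the assumed input size, and normalize to [0, 1].
  const size = 640;
  const canvas = document.createElement("canvas");
  canvas.width = size;
  canvas.height = size;
  const ctx = canvas.getContext("2d");
  ctx.drawImage(video, 0, 0, size, size);
  const { data } = ctx.getImageData(0, 0, size, size); // RGBA bytes

  // Convert RGBA pixels to a CHW float32 tensor.
  const input = new Float32Array(3 * size * size);
  for (let i = 0; i < size * size; i++) {
    input[i] = data[i * 4] / 255;                       // R
    input[i + size * size] = data[i * 4 + 1] / 255;     // G
    input[i + 2 * size * size] = data[i * 4 + 2] / 255; // B
  }

  // 4. Run inference; the input/output names and layout depend on the exported model.
  const tensor = new ort.Tensor("float32", input, [1, 3, size, size]);
  const results = await session.run({ images: tensor });
  console.log(results);
}
```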

Thank you for your time

Hello! There are a few ways to use a webcam with Roboflow models.

Hello, thank you so much, but I am looking for a solution that runs on the edge, with everything local.

No problem! You can run all Roboflow models locally using Inference: https://inference.roboflow.com/
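
For example, once the Inference server is running locally, a browser page can send webcam frames to it with fetch(). The sketch below is only an outline: the default port 9001 and the model/version endpoint pattern are assumptions based on the hosted detection API, so check the Inference docs for the exact routes.

```js
// Sketch: POST a webcam frame to a locally running Roboflow Inference server.
// YOUR_MODEL, the version number, and YOUR_API_KEY are placeholders.
async function inferFrame(video) {
  // Draw the current video frame to an offscreen canvas and encode it as base64.
  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext("2d").drawImage(video, 0, 0);
  const base64 = canvas.toDataURL("image/jpeg").split(",")[1];

  // Send the frame to the local server (endpoint pattern assumed; see the Inference docs).
  const response = await fetch(
    "http://localhost:9001/YOUR_MODEL/1?api_key=YOUR_API_KEY",
    {
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
      body: base64,
    }
  );
  return response.json(); // JSON with a `predictions` array
}
```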

It works with Python, thank you, but my idea is to do the inference in the browser, locally, for example at
localhost:3000/inference.html
and see my webcam feed with the bounding boxes, just by loading the ONNX model (or another format) using JavaScript.
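
For the bounding boxes, I imagine an overlay like the sketch below, assuming each prediction comes back with center x/y, width, height, class, and confidence (adjust if your model returns corner coordinates instead).

```js
// Sketch: draw predictions on a <canvas> positioned over the <video> element.
function drawBoxes(canvas, predictions) {
  const ctx = canvas.getContext("2d");
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.lineWidth = 2;
  ctx.strokeStyle = "lime";
  ctx.fillStyle = "lime";
  ctx.font = "14px sans-serif";

  for (const p of predictions) {
    // Convert center-based coordinates to the top-left corner.
    const x = p.x - p.width / 2;
    const y = p.y - p.height / 2;
    ctx.strokeRect(x, y, p.width, p.height);
    ctx.fillText(`${p.class} ${(p.confidence * 100).toFixed(0)}%`, x, y - 4);
  }
}
```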

Thanks

Hey @Andrea_Gelsomino

There are several ways to run inference, both hosted and locally.

The resource Trevor shared in his first answer, roboflow.js, runs your model locally in your browser. It will return your model's inference results in JSON format.
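
A minimal sketch of that flow is below; the publishable key, model name, and version are placeholders, and the auth/load/detect pattern follows the roboflow.js docs.

```js
// Sketch: authenticate, load a model version, then run detect() on the <video> element in a loop.
roboflow
  .auth({ publishable_key: "rf_YOUR_PUBLISHABLE_KEY" })
  .load({ model: "your-model", version: 1 })
  .then((model) => {
    const video = document.querySelector("video");
    async function loop() {
      const predictions = await model.detect(video); // JSON array of detections
      console.log(predictions);
      requestAnimationFrame(loop); // keep detecting on each animation frame
    }
    loop();
  });
```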
