I'm trying to run inference on an IP camera stream

Hello, I’m trying to run inference on an IP camera stream by using the instructions here:

My code is below. I get the following error when creating inference.Stream. Could you please help me understand why?

File "/home/administrator/rtsp_roboflow_test.py", line 20, in <module>
    inference.Stream(
File "/home/administrator/.local/lib/python3.10/site-packages/inference/core/interfaces/stream/stream.py", line 176, in __init__
    self.run_thread()
File "/home/administrator/.local/lib/python3.10/site-packages/inference/core/interfaces/stream/stream.py", line 332, in run_thread
    self.inference_request_thread()
File "/home/administrator/.local/lib/python3.10/site-packages/inference/core/interfaces/stream/stream.py", line 273, in inference_request_thread
    predictions = self.model.make_response(
File "/home/administrator/.local/lib/python3.10/site-packages/inference/core/models/object_detection_base.py", line 121, in make_response
    responses = [
File "/home/administrator/.local/lib/python3.10/site-packages/inference/core/models/object_detection_base.py", line 123, in <listcomp>
    predictions=[
File "/home/administrator/.local/lib/python3.10/site-packages/inference/core/models/object_detection_base.py", line 124, in <listcomp>
    ObjectDetectionPrediction(
File "/home/administrator/.local/lib/python3.10/site-packages/pydantic/main.py", line 164, in __init__
    __pydantic_self__.__pydantic_validator__.validate_python(data, self_instance=__pydantic_self__)
pydantic_core._pydantic_core.ValidationError: 1 validation error for ObjectDetectionPrediction
class_confidence
  Field required [type=missing, input_value={'x': 1776.5, 'y': 247.5,…: 'face', 'class_id': 1}, input_type=dict]
For further information visit Redirecting...
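
For what it's worth, the error seems to say that each prediction dict coming back from the model is missing the class_confidence field that ObjectDetectionPrediction requires, so the installed package versions are worth checking first. A quick diagnostic sketch (importlib.metadata is standard library; nothing here is specific to this camera or model):

import importlib.metadata

# Print the installed versions of the packages that appear in the traceback
for pkg in ("inference", "supervision", "pydantic"):
    print(pkg, importlib.metadata.version(pkg))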

code:

import cv2
import inference
import supervision as sv

# Box annotator for drawing detections (not yet used in the callback below)
annotator = sv.BoxAnnotator()

def render(predictions, image):
    # For now, just print the raw predictions for each frame
    print(predictions)

inference.Stream(
    source="rtsp://192.168.131.11:554/axis-media/media.amp",  # PTZ camera RTSP URL
    model="gauges-sfqxy/4",  # Roboflow model ID
    output_channel_order="BGR",
    on_prediction=render,
)
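
For reference, the annotator above could eventually be wired into render to draw the boxes; a sketch, assuming a supervision release that provides Detections.from_inference:

import cv2
import supervision as sv

annotator = sv.BoxAnnotator()

def render(predictions, image):
    # Convert the raw inference response into a supervision Detections object
    detections = sv.Detections.from_inference(predictions)
    # Draw boxes on a copy of the frame and display it
    annotated = annotator.annotate(scene=image.copy(), detections=detections)
    cv2.imshow("stream", annotated)
    cv2.waitKey(1)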

Hi @Fremont

Sorry to hear you’re having problems. If you do experience issues with our local inference package, please file a GitHub issue on the Inference Repo.
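
In the meantime, if you're on a recent release of inference, the InferencePipeline interface also handles RTSP sources; a minimal sketch, assuming InferencePipeline is available in your installed version and that your Roboflow API key is set in the ROBOFLOW_API_KEY environment variable:

from inference import InferencePipeline
from inference.core.interfaces.stream.sinks import render_boxes

# Pull frames from the RTSP source, run the model on each one, and draw the
# results with the built-in render_boxes sink (swap in your own callback if needed)
pipeline = InferencePipeline.init(
    model_id="gauges-sfqxy/4",
    video_reference="rtsp://192.168.131.11:554/axis-media/media.amp",
    on_prediction=render_boxes,
)
pipeline.start()
pipeline.join()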
