I have set up an NVIDIA Jetson Nano with the Roboflow inference server by following this document. The Jetson Nano is flashed with JetPack 4.6.
When I try to run the below example on an image, it works well:
base64 YOUR_IMAGE.jpg | curl -d @- \
  "http://localhost:9001/your-model/42?api_key=YOUR_KEY"
Then I try to run the example below (infer-simple.py) to do webcam inference:
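For reference, the core of my script looks roughly like this. Treat it as a sketch: the model name, port, and API key are placeholders, and my actual copy follows the Roboflow sample script.

```python
import base64
from urllib import request

# Placeholders -- my real script uses my own model id, version, and key.
UPLOAD_URL = (
    "http://localhost:9001/yellow-flowers/1"
    "?api_key=YOUR_KEY&format=image"
)

def encode_frame(jpeg_bytes: bytes) -> bytes:
    """Base64-encode raw JPEG bytes for the POST body."""
    return base64.b64encode(jpeg_bytes)

def infer(frame):
    """POST one frame to the inference server; decode the reply as an image."""
    import cv2          # heavy deps kept local so the helper above stays pure
    import numpy as np
    ok, buf = cv2.imencode(".jpg", frame)
    if not ok:
        return None
    req = request.Request(
        UPLOAD_URL,
        data=encode_frame(buf.tobytes()),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with request.urlopen(req) as resp:
        body = resp.read()
    arr = np.frombuffer(body, dtype=np.uint8)
    return cv2.imdecode(arr, cv2.IMREAD_COLOR)  # None if body is not an image

if __name__ == "__main__":
    import cv2
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        image = infer(frame)
        cv2.imshow("image", image)   # this is the line that throws for me
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
```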
However, it returns the following error:
Traceback (most recent call last):
  File "infer-simple.py", line 71, in <module>
    cv2.imshow('image', image)
cv2.error: OpenCV(4.5.5) /io/opencv/modules/highgui/src/window.cpp:1000: error: (-215:Assertion failed) size.width>0 && size.height>0 in function 'imshow'
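From the traceback, cv2.imshow is being handed an empty image. A small guard like the following (a hypothetical helper, not part of the original script) makes that failure explicit:

```python
def is_valid_frame(image) -> bool:
    """True only for a non-empty image array; passing None or a
    zero-sized array to cv2.imshow triggers the size.width>0 assertion."""
    return image is not None and getattr(image, "size", 0) > 0
```

With `if not is_valid_frame(image): continue` before the imshow call, the window code stops crashing, but the underlying problem remains: the server reply is not decoding to an image.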
The output log on Jetson Nano shows that the model is loaded:
initializing...
inference-server is ready to receive traffic.
Downloading weights for yellow-flowers/1
requesting from roboflow server...
Weights downloaded in 2.13 seconds
Initializing model...
2022-03-31 11:45:30.612895: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2022-03-31 11:45:35.930139: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
This model execution did not contain any nodes with control flow or dynamic output shapes. You can use model.execute() instead.
Model prepared in 9.24 seconds
I have already changed “line 22” of the code I shared before (infer-simple.py) so that the upload URL points to the local Jetson inference server.
However, when I change “line 22” back to use the Roboflow cloud server, the example works fine.
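For clarity, the two variants of “line 22” I am switching between look roughly like this (model id, version, and key are placeholders from my workspace):

```python
API_KEY = "YOUR_KEY"          # placeholder
MODEL = "yellow-flowers/1"    # my model id and version

# Hosted Roboflow inference -- this variant works:
CLOUD_URL = f"https://detect.roboflow.com/{MODEL}?api_key={API_KEY}&format=image"

# Local Jetson inference server -- this variant fails:
LOCAL_URL = f"http://localhost:9001/{MODEL}?api_key={API_KEY}&format=image"
```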
I have also tried printing the return value of infer() for debugging, and it printed None, which suggests the server response was not parsed into an image.
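To dig further I put together a small debug helper (hypothetical names) that reports what the server actually returns, before any image decoding is attempted:

```python
from urllib import error, request

def fetch_raw(url: str, payload: bytes):
    """POST the payload and return (status, body) without decoding."""
    req = request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    try:
        with request.urlopen(req) as resp:
            return resp.status, resp.read()
    except error.HTTPError as exc:
        return exc.code, exc.read()

def summarize(status: int, body: bytes) -> str:
    """One-line summary: JPEG replies start with the ff d8 magic bytes."""
    kind = "JPEG image" if body.startswith(b"\xff\xd8") else "not an image"
    return f"HTTP {status}, {len(body)} bytes, {kind}"
```

If the summary says “not an image”, the body is presumably a JSON error message from the server, which would explain why the decoded result is None.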
Please advise on how to resolve this issue.