Running Trained Model on GPU

Hi! I am trying to run an object detection model that I trained on the Roboflow online app on a Jetson NX using JetPack 6. Unfortunately, when I attempt to install the inference-gpu library, I am met with an onnxruntime-gpu library dependency issue, which will not resolve itself. I have tried uninstalling and reinstalling the correct onnxruntime-gpu version using its wheel before installing inference-gpu, and I receive the same dependency error. I am unable to provide a screenshot of the error at this time, but if necessary I can add it in the comments. Thanks for your help in advance!

Could you post the install commands you’re running and the error messages they produce?

Yes, here they are. I should also mention that I am trying to run this in a Docker container, so these install commands are in a Dockerfile. The issue seems to be that the only onnxruntime-gpu version available from the index is 1.17.0, while inference-gpu depends on onnxruntime-gpu<=1.15.1. I have tried uninstalling and reinstalling onnxruntime-gpu, and installing it from a wheel, but nothing seems to work. Thank you for your help and quick response!

RUN pip install roboflow
RUN pip install "onnxruntime-gpu<1.15.1"
RUN pip install inference-gpu
RUN pip install supervision

=> ERROR [ptz-new 4/8] RUN pip install "onnxruntime-gpu<1.15.1" 1.7s

[ptz-new 4/8] RUN pip install "onnxruntime-gpu<1.15.1":
1.140 Looking in indexes: jp6/cu122 index
1.445 ERROR: Could not find a version that satisfies the requirement onnxruntime-gpu<1.15.1 (from versions: 1.17.0)
1.445 ERROR: No matching distribution found for onnxruntime-gpu<1.15.1


failed to solve: process "/bin/bash -c pip install \"onnxruntime-gpu<1.15.1\"" did not complete successfully: exit code: 1

If I do install the available version of onnxruntime-gpu, I receive this error, which makes sense:

77.58 ERROR: Cannot install inference-gpu==0.10.0, inference-gpu==0.11.0, inference-gpu==0.11.1, inference-gpu==0.11.2, inference-gpu==0.12.0, inference-gpu==0.12.1, inference-gpu==0.13.0, inference-gpu==0.14.0, inference-gpu==0.14.1, inference-gpu==0.15.0, inference-gpu==0.15.1, inference-gpu==0.7.2, inference-gpu==0.7.6, inference-gpu==0.8.0, inference-gpu==0.8.1, inference-gpu==0.8.2, inference-gpu==0.8.4, inference-gpu==0.8.5, inference-gpu==0.8.8, inference-gpu==0.8.9, inference-gpu==0.9.0, inference-gpu==0.9.1, inference-gpu==0.9.10, inference-gpu==0.9.11, inference-gpu==0.9.12, inference-gpu==0.9.13, inference-gpu==0.9.14, inference-gpu==0.9.15, inference-gpu==0.9.16, inference-gpu==0.9.17, inference-gpu==0.9.18, inference-gpu==0.9.2, inference-gpu==0.9.20, inference-gpu==0.9.22, inference-gpu==0.9.23, inference-gpu==0.9.3, inference-gpu==0.9.4, inference-gpu==0.9.5, inference-gpu==0.9.6, inference-gpu==0.9.7, inference-gpu==0.9.8 and inference-gpu==0.9.9 because these package versions have conflicting dependencies.
77.58
77.58 The conflict is caused by:
77.58 inference-gpu 0.15.1 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.15.0 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.14.1 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.14.0 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.13.0 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.12.1 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.12.0 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.11.2 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.11.1 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.11.0 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.10.0 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.23 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.22 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.20 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.18 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.17 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.16 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.15 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.14 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.13 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.12 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.11 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.10 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.9 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.8 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.7 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.6 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.5 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.4 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.3 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.2 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.1 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.9.0 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.8.9 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.8.8 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.8.5 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.8.4 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.8.2 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.8.1 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.8.0 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.7.6 depends on onnxruntime-gpu<=1.15.1
77.58 inference-gpu 0.7.2 depends on onnxruntime-gpu<=1.15.1
77.58
77.58 To fix this you could try to:
77.58 1. loosen the range of package versions you’ve specified
77.58 2. remove package versions to allow pip attempt to solve the dependency conflict
77.58
77.58 ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts

failed to solve: process "/bin/bash -c pip install inference-gpu" did not complete successfully: exit code: 1

For additional context, by installing onnxruntime-gpu “using its wheel” I mean running these commands, which still result in the same “depends on onnxruntime-gpu<=1.15.1” error:

RUN wget https://nvidia.box.com/shared/static/iizg3ggrtdkqawkmebbfixo7sce6j365.whl -O onnxruntime_gpu-1.15.1-cp310-cp310-linux_aarch64.whl
RUN pip install onnxruntime_gpu-1.15.1-cp310-cp310-linux_aarch64.whl
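
One thing I have been considering, but have not verified, is skipping pip’s dependency resolution for inference-gpu entirely once the wheel is installed, and then installing the remaining dependencies by hand. Something like the following, where the --no-deps approach and the extra package list are just my guesses:

RUN wget https://nvidia.box.com/shared/static/iizg3ggrtdkqawkmebbfixo7sce6j365.whl -O onnxruntime_gpu-1.15.1-cp310-cp310-linux_aarch64.whl
RUN pip install onnxruntime_gpu-1.15.1-cp310-cp310-linux_aarch64.whl
# --no-deps keeps pip from trying to resolve onnxruntime-gpu<=1.15.1 against the jp6/cu122 index
RUN pip install --no-deps inference-gpu
# the remaining inference-gpu dependencies would then have to be installed manually (this list is a guess, not exhaustive)
RUN pip install opencv-python pillow requests supervision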

Hi Kalena,

I’m looking into this some more and it looks like we don’t have good support for JetPack 6 yet.

I will be looking into how to make this work with JetPack 6 later this week and will keep you updated with what we figure out.

In case it’s helpful, we do have prebuilt Docker images for Jetson with JetPack 4.6 and 5.1 available on Docker Hub:
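
Something along these lines should pull and start the JetPack 5.1 server on the device (the image name and tag below are from memory, so please double-check them on Docker Hub before relying on them):

sudo docker pull roboflow/roboflow-inference-server-jetson-5.1.1:latest
# --runtime nvidia exposes the Jetson GPU to the container; --net host makes the server reachable on port 9001
sudo docker run --runtime nvidia --net host roboflow/roboflow-inference-server-jetson-5.1.1:latest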

That’s what I figured. Do you think I could possibly build onnxruntime-gpu version 1.15.1 from source and copy that into my docker container? Would inference-gpu recognize that? Regardless, let me know what you come up with, and thanks again!
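
To clarify what I mean, here is a rough sketch of the Dockerfile steps I am imagining (completely untested, and the wheel filename plus the --no-deps step are assumptions on my part):

# copy a locally built onnxruntime-gpu 1.15.1 wheel into the image and install it first
COPY onnxruntime_gpu-1.15.1-cp310-cp310-linux_aarch64.whl /tmp/
RUN pip install /tmp/onnxruntime_gpu-1.15.1-cp310-cp310-linux_aarch64.whl
# then install inference-gpu without letting pip re-resolve onnxruntime-gpu from the index
RUN pip install --no-deps inference-gpu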

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.