Luxonis OAK inference JSON error, host device is Jetson Nano

Host device: Jetson Nano, L4T R32.7.4, python 3.6.9

I installed depthai, opencv-python, and roboflowoak:
depthai == 2.25.0.0
opencv-python == 4.9.0.80
roboflowoak == 0.0.12

Hello, I’m trying to deploy a custom model for Luxonis OAK inference, but I’m getting a JSON error when instantiating the rf object.

  line 61, in __init__
      self.model_objects = json.loads(f.read())
    File "/usr/lib/python3.6/json/__init__.py", line 354, in loads
      return _default_decoder.decode(s)
    File "/usr/lib/python3.6/json/decoder.py", line 339, in decode
      obj, end = self.raw_decode(s, idx=_w(s, 0).end())
    File "/usr/lib/python3.6/json/decoder.py", line 357, in raw_decode
      raise JSONDecodeError("Expecting value", s, err.value) from None
  json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
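
For context, my understanding is that this decoder error means json.loads was handed something that isn't JSON at all, typically an empty file or an HTML error page saved during a failed model download. I haven't confirmed which file roboflowoak is reading at line 61 of its __init__, so the path in the sketch below is only a placeholder for whatever that file turns out to be:

        # Minimal check of the file that json.loads is choking on.
        # The path is a placeholder; substitute whatever file line 61 of
        # roboflowoak's __init__ actually opens on this Jetson.
        import json
        import os

        candidate = os.path.expanduser("~/path/to/cached_model.json")  # hypothetical path

        print("size (bytes):", os.path.getsize(candidate))  # 0 would mean an empty/failed download

        with open(candidate) as f:
            text = f.read()
        print("first 200 chars:", repr(text[:200]))  # "<html>..." here would also explain the error

        try:
            json.loads(text)
            print("file parses as valid JSON")
        except json.JSONDecodeError as e:
            print("file is not valid JSON:", e)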

I’m using the code below, provided by Roboflow (model name and API key omitted in this post).

        from roboflowoak import RoboflowOak
        import cv2
        import time
        import numpy as np
        
        if __name__ == '__main__':
            # instantiating an object (rf) with the RoboflowOak module
            rf = RoboflowOak(model="____", confidence=0.05, overlap=0.5,
                             version="1", api_key="____", rgb=True,
                             depth=True, device=None, blocking=True)

This code works on a different Jetson Nano, which has the same versions of L4T, depthai, opencv-python, and roboflowoak. Unfortunately I no longer have access to that other Jetson Nano.

Has anyone else encountered this issue? Any help/suggestions would be appreciated! Thank you

Hi @tde - the Luxonis OAK is a separate edge device. You should either use it as the compute device (and ditch the Jetson), or just use it to pass frames (in which case you don’t need the roboflowoak package).

If you’re just trying to run inference on a Jetson, try the inference repo.
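
If you do go that route, the shape of it is roughly the following. The model id, API key, and camera source are placeholders, and the calls follow the inference package's quickstart as I remember it, so please verify against the repo's docs:

        # Rough sketch of running inference on the Jetson itself with the
        # `inference` package instead of roboflowoak. Model id, API key, and
        # the camera index are placeholders; check the repo for the exact API.
        import cv2
        from inference import get_model

        model = get_model(model_id="your-project/1", api_key="YOUR_API_KEY")

        cap = cv2.VideoCapture(0)  # or frames you pass over from the OAK via depthai
        ret, frame = cap.read()
        if ret:
            results = model.infer(frame)  # predictions for a single frame
            print(results)
        cap.release()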

Unfortunately, I cannot abandon the Jetson because my group has already integrated it into other aspects of our project. We are using the OAK to run inference because, for our use case, it is simpler and has better thermal performance than running inference on the Jetson. This exact code works with the same Luxonis OAK when hosted on a different Jetson Nano, but we no longer have access to that second Jetson.

Do you know what could be causing the JSON issue?
