Problems when parsing model metadata using inferencejs

  • **Project Type:** Instance segmentation
  • **Operating System & Browser:** Windows, Firefox
  • **Project Universe Link or Workspace/Project ID:** Sign in to Roboflow

Hello everyone. I am currently running into this error when I try to run my model through inferencejs: `Error: Error loading model spectrogram-4ah9q/instance-segmentation-for-images/2: Error: Error happened while parsing model metadata. {}`
Below is the code I use for inference. I train my model locally, then upload the trained weights to Roboflow using the `deploy` function in the Roboflow Python package.
Has anyone run into this problem before, and how can it be solved?

Unfortunately, inference.js doesn’t currently support instance-segmentation models. Instead, try hitting our hosted API with a POST request, or stand up your own API in a couple of commands using our inference Docker container:

```shell
pip install inference-cli
inference server start
```
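For the hosted-API route, a request can be sketched as below. This is a minimal illustration, not a definitive implementation: the `outline.roboflow.com` host, the base64 request body, and the model ID are assumptions based on Roboflow's documented hosted-API pattern for instance segmentation, so check your project's deploy tab for the exact endpoint.

```python
import base64

# Hypothetical values -- replace with your own model ID, version, and API key.
MODEL_ID = "spectrogram-4ah9q/instance-segmentation-for-images"
VERSION = 2
API_KEY = "YOUR_API_KEY"


def build_inference_url(model_id: str, version: int, api_key: str,
                        host: str = "https://outline.roboflow.com") -> str:
    """Build the hosted-API URL; the 'outline' host is an assumption
    for instance-segmentation models (object detection uses 'detect')."""
    return f"{host}/{model_id}/{version}?api_key={api_key}"


def encode_image(path: str) -> str:
    """Base64-encode a local image file for the POST body."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")


# Sending the request (requires the `requests` package and a valid API key):
# import requests
# resp = requests.post(
#     build_inference_url(MODEL_ID, VERSION, API_KEY),
#     data=encode_image("image.jpg"),
#     headers={"Content-Type": "application/x-www-form-urlencoded"},
# )
# print(resp.json())  # predictions with per-instance polygon points
```

The same request can be made from browser JavaScript with `fetch`, which avoids inference.js entirely for unsupported model types.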
