{
  error: {
    error: {
      message: 'No trained model was found for this model version.',
      type: 'GraphMethodException',
      hint: 'You must train a model on this version with Roboflow Train before you can use inference.',
      e: [Array]
    }
  }
}
But in this case the Docker container raises an error:
TypeError: req.body.replace is not a function
    at transformImageBody (/inference-server/server/index.js:323:26)
    at Layer.handle [as handle_request] (/inference-server/server/node_modules/express/lib/router/layer.js:95:5)
    at next (/inference-server/server/node_modules/express/lib/router/route.js:144:13)
    at Route.dispatch (/inference-server/server/node_modules/express/lib/router/route.js:114:3)
    at Layer.handle [as handle_request] (/inference-server/server/node_modules/express/lib/router/layer.js:95:5)
    at /inference-server/server/node_modules/express/lib/router/index.js:284:15
    at param (/inference-server/server/node_modules/express/lib/router/index.js:365:14)
    at param (/inference-server/server/node_modules/express/lib/router/index.js:376:14)
    at param (/inference-server/server/node_modules/express/lib/router/index.js:376:14)
    at Function.process_params (/inference-server/server/node_modules/express/lib/router/index.js:421:3)
Hi, did you resolve this issue? I’m getting the same error now on a model I trained in a Colab Notebook and deployed back to Roboflow. The deployed model is available on my page but when I try to use it in my web app, I get this error:
{
  "error": {
    "message": "No trained model was found for this model version.",
    "type": "GraphMethodException",
    "hint": "You must train a model on this version with Roboflow Train before you can use inference.",
    "e": [
      "Could not parse size from model.json"
    ]
  }
}
Hi @stellashphere, no, I'm just pulling it directly into a web page on my local test rig for now. I retrained the model on a new version and the issue is now resolved. No idea what went wrong with the previous version.