`inference.get_model()` from a local file

According to the inference.get_model() documentation (Roboflow - Roboflow Inference), it is possible to load a model from Roboflow Universe, but not from a local file.

For reference, what I want to do is something like this:

from rfdetr import RFDETRMedium
from inference import get_model

# Build an RF-DETR model at a custom resolution, then export it to ONNX
model = RFDETRMedium(resolution=960)
model.export()

# Desired (but currently unsupported): load the exported file directly
get_model("./output/inference_model.onnx")

(this is continuation of my previous post Changing Inference Resolution with roboflow/inference for Models like rf-detr - #2 by Ford)

Hi @Kallinteris-Andreas,

At this point in time, the inference package does not provide this feature. For now, you can check this documented env variable to have your locally stored model loaded as if it were cached (keeping licensing in mind).
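In case a concrete sketch helps: assuming the variable in question is MODEL_CACHE_DIR and that the cache is laid out as one directory per model id containing the weights file (both are assumptions here, not confirmed by this thread; check the linked env-variable docs for the exact name and layout), the idea would be to pre-populate the cache directory and point inference at it before calling get_model():

```python
import os
import pathlib

# Hypothetical sketch -- "MODEL_CACHE_DIR", the "my-project/1" model id,
# and the "weights.onnx" filename are assumptions; confirm them against
# the env-variable documentation linked above.
cache_dir = pathlib.Path("./my_model_cache")
model_dir = cache_dir / "my-project" / "1"  # placeholder model id/version
model_dir.mkdir(parents=True, exist_ok=True)

# Stand-in for the ONNX file produced by model.export()
(model_dir / "weights.onnx").write_bytes(b"placeholder")

# Point inference's cache at our directory *before* importing get_model(),
# so the library reads the weights from disk instead of downloading them.
os.environ["MODEL_CACHE_DIR"] = str(cache_dir)
```

After that, get_model() would still be called with a regular Roboflow model id (here "my-project/1"), not a file path; the env variable only changes where the weights are looked up.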

Hope this helps,
Grzegorz


This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.