I have trained a custom classification model based on Vision Transformers (ViT). After downloading the model weights (.pt) I tried loading them with the regular PyTorch load method, but I got a "module src not found" error.
What is the best option to load the weights and make predictions using a ViT-based model?
I see – I’m guessing it is something to do with your PyTorch installation. We highly recommend using https://inference.roboflow.com/ to run inference on models trained in Roboflow!
It could be, but I don’t think so, as I am successfully loading a few other external models with PyTorch.
Moreover, the documentation for the Inference Python module is confusing. I did not find how to authenticate with an API key. If you could point me to where I can find it, that would be great.
Absolutely! Here is the documentation page on authenticating with the Roboflow API Key. The best way is to set it as an environment variable! Retrieve Your API Key - Roboflow Inference
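In case it helps, here is a minimal sketch of the environment-variable approach. The variable name `ROBOFLOW_API_KEY` and the placeholder key value are assumptions on my part – double-check them against the linked documentation page:

```python
import os

# Assumed variable name: Roboflow's inference tooling reads the API key
# from ROBOFLOW_API_KEY. The value below is a placeholder, not a real key.
os.environ["ROBOFLOW_API_KEY"] = "your_api_key_here"

# Client libraries can then pick it up implicitly, or you can read it
# back yourself to pass explicitly as an api_key argument.
api_key = os.environ["ROBOFLOW_API_KEY"]
print(len(api_key) > 0)  # True
```

Setting it once in your shell profile (rather than in code) keeps the key out of your scripts.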
However, is there any option to run inference completely offline, rather than using the inference server in the background? The reason is that even using a GPU it is slow, and after hundreds of predictions it throws a 500 Internal Server Error.
I mean, I just want to load the trained model, call whatever the predict function is, and get the results.
Regarding the “module not found: src” issue, it seems the .pt weights file has the folder structure serialized into it as well. After some trial and error I got the following message: “src.classification.timm_model.ImageClassifier” needs to be added as Safe Globals.
The issue is: I don’t have the source code for the src.classification.timm_model.ImageClassifier class, nor do I have the model architecture saved.
So the question is: is this a bug in how the model weights file was saved? If not, how can I get the model architecture and create a workaround for it?
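For what it’s worth, the “Safe Globals” message comes from recent PyTorch versions defaulting `torch.load` to `weights_only=True`; a full pickled model additionally needs its class importable at load time. One possible workaround – a sketch, not an official fix, and assuming the checkpoint is a full pickled model rather than a plain state_dict – is to register a stub class under the missing `src.classification.timm_model` import path so unpickling can resolve it, then pull the state_dict out of the loaded object. The demo below fabricates such a checkpoint in memory so it is self-contained; with a real weights file you would only need the second half, and you would still need the real architecture to run predictions:

```python
import io
import sys
import types

import torch
import torch.nn as nn


def register_stub(module_path: str, class_name: str) -> type:
    """Create empty modules along module_path and attach a stub
    nn.Module subclass named class_name, so pickle can resolve it."""
    parts = module_path.split(".")
    for i in range(1, len(parts) + 1):
        name = ".".join(parts[:i])
        sys.modules.setdefault(name, types.ModuleType(name))
    stub = type(class_name, (nn.Module,), {})
    stub.__module__ = module_path  # pickle looks classes up by this path
    setattr(sys.modules[module_path], class_name, stub)
    return stub


# --- fabricate a checkpoint whose class lives in a missing package -----
real_cls = register_stub("src.classification.timm_model", "ImageClassifier")
model = real_cls()
model.head = nn.Linear(4, 2)  # some weights to recover later
buf = io.BytesIO()
torch.save(model, buf)

# simulate the failing machine: the src package does not exist there
for name in list(sys.modules):
    if name == "src" or name.startswith("src."):
        del sys.modules[name]

# --- the workaround itself ---------------------------------------------
register_stub("src.classification.timm_model", "ImageClassifier")
buf.seek(0)
# weights_only=False is required for full pickled models; only use it on
# files you trust, since unpickling can execute arbitrary code.
obj = torch.load(buf, map_location="cpu", weights_only=False)
state_dict = obj.state_dict()
print(sorted(state_dict))  # ['head.bias', 'head.weight']
```

The recovered state_dict keys and tensor shapes can then help you reconstruct the architecture (e.g. against a matching timm backbone), but the stub object itself cannot run a forward pass.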
Using model.infer is much faster than the roboflow module.
I had problems before because I was using model.predict, and I had to perform many array operations before passing the image to the method.
By the way, do you have any idea why I can’t load the model into PyTorch using torch.load()?