Hi,
I was wondering whether I can train a model on Roboflow and then export or retrieve the model file. In this case, having trained a model with YOLOv5, I wanted the .pt or .pth file. Is that possible?
Update as of February 2025: our new Basic and Growth plans now include model weights download. Docs: Download Roboflow Model Weights | Roboflow Docs
Hi Luis,
Unfortunately, we do not release the weight files from Roboflow Train by default, but any custom model trained with our Model Zoo (models.roboflow.com) produces final weight files you can export.
To train a custom YOLOv5 model: Video
To save the weights: How to Save and Load Model Weights in Google Colab
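For anyone following along, here is a minimal sketch of keeping the weights from a YOLOv5 Colab run; the runs/train/exp path is the YOLOv5 default and may differ for your run, and the Drive destination is a placeholder:

from google.colab import drive
import shutil

# Mount Google Drive so the weights survive after the Colab runtime is recycled.
drive.mount('/content/drive')

# YOLOv5 writes its checkpoints to runs/train/<run-name>/weights/ by default.
shutil.copy('/content/yolov5/runs/train/exp/weights/best.pt',
            '/content/drive/MyDrive/best.pt')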
Do you have any plans to do so in the future? It would be really nice to have a completely no-code solution instead of having to go outside of Roboflow and use a Google Colab notebook. I would like to be able to export it as a TFLite model.
We do have local deployment options: Launch: Test Computer Vision Models Locally and Inference - Object Detection - Roboflow (any option that reads "On-Device").
For Raspberry Pi deployment, for example: Raspberry Pi (On Device) - Roboflow
We do take feature request feedback into account. Implementation depends on the number of people requesting it and on our development team's timeline.
Being able to create a tflite model would be great!
So far I have not managed to create a TFLite model in Colab based on the Roboflow data with the instructions I have found.
The examples I found in Roboflow are based on an old TF version and don't work properly anymore.
Hi, we're working on an updated Colab notebook that works with TensorFlow 2 for export to TFLite.
I haven't fully tested it myself yet, but it worked with the sample data provided in Google's original notebook.
Once we get it tested, we'll update the notebook to make the flow easier to follow and include it in the Notebooks repository on GitHub.
Thank you, I will try it.
Here's an updated TFLite notebook: Google Colab
The model architecture used in training is: EfficientDet
I fully tested it with a project of my own.
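For a rough idea of what the notebook does, here is a minimal sketch of the TFLite Model Maker flow it follows; the paths, label map, and hyperparameters below are placeholders, not the notebook's exact values:

from tflite_model_maker import object_detector

# Pick the EfficientDet-Lite variant to train.
spec = object_detector.EfficientDetLite1Spec()

# Load a dataset exported from Roboflow in Pascal VOC format (placeholder paths/labels).
train_data = object_detector.DataLoader.from_pascal_voc(
    images_dir='train', annotations_dir='train', label_map={1: 'cookie'})
valid_data = object_detector.DataLoader.from_pascal_voc(
    images_dir='valid', annotations_dir='valid', label_map={1: 'cookie'})

# Train, then export a .tflite model plus the label file.
model = object_detector.create(train_data, model_spec=spec,
                               validation_data=valid_data,
                               epochs=50, batch_size=8)
model.export(export_dir='.', tflite_filename='model.tflite')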
Hi, thank you for the great support
I'm just trying it out. I still have one question:
The image size of my images is 224x224.
The input size for EfficientDet-Lite1 is 640x640; do I have to adjust the image size first, or can I work with 224x224?
So far I've created the models with Teachable Machine, which requires an image size of 224x224 for a TFLite model.
The following error occurs when exporting.
I am using TensorFlow version 2.8.4.
tflite_filename = 'model_Cookie.tflite'
label_filename = 'labels.txt'
vocab_filename = 'vocab.txt'
saved_model_filename = 'saved_model'
tfjs_folder_name = 'tfjs'

# Available formats: tflite, tfjs, saved_model, vocab, label
# export_format = ['tflite', 'tfjs', 'saved_model']  # export as TFLite, TF.js, and TF SavedModel
export_format = ['tflite']
I didn't specify any special characters either.
It would be best to use a size comparable to the input size of the model architecture you're training with. Generating a new version at 640x640 is an option in Roboflow, too.
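If it helps, once you generate the 640x640 version, it can be pulled straight into the notebook with the roboflow pip package; the API key, workspace, project, and version number below are placeholders:

from roboflow import Roboflow

rf = Roboflow(api_key="YOUR_API_KEY")
project = rf.workspace("your-workspace").project("your-project")
# A Pascal VOC export pairs well with the TFLite Model Maker loader.
dataset = project.version(1).download("voc")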
I'm not sure what underlying model Teachable Machine uses. Whatever architecture they use is what determines those resize parameters.
I was able to fully run the notebook and get the export without issue.
Maybe it's the _Cookie you've included in your tflite_filename? I also ran this notebook with TensorFlow 2.8.4 - I created the notebook while I ran through the full process last week.
And I do also want to add that you can change the EfficientDet model you use - I just happened to choose 1 in this case.
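As an illustration (variant names from the TFLite Model Maker API), changing the model is just a matter of swapping the spec:

from tflite_model_maker import object_detector

# Lite0 is the smallest/fastest, Lite4 the largest/most accurate.
spec = object_detector.EfficientDetLite0Spec()
# spec = object_detector.EfficientDetLite1Spec()  # the one used in the notebook
# spec = object_detector.EfficientDetLite4Spec()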
Strange!
I modified the code as follows; then at least the .tflite file is created.
model.export(export_dir='.', tflite_filename=tflite_filename,
             label_filename=label_filename, saved_model_filename=saved_model_filename,
             tfjs_folder_name=tfjs_folder_name)
As soon as I add export_format=export_format, the error comes.
The labels.txt is not generated.
I tried with
model.export(export_dir='.', with_metadata=False)
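One guess, not verified against this exact notebook: export_format may expect the ExportFormat enum from tflite_model_maker rather than plain strings, and labels.txt is only written when the label format is requested. A sketch under that assumption:

from tflite_model_maker.config import ExportFormat

# Request the TFLite model and the label file explicitly (enum members, not strings).
model.export(export_dir='.',
             tflite_filename=tflite_filename,
             label_filename=label_filename,
             export_format=[ExportFormat.TFLITE, ExportFormat.LABEL])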
@staebchen0 I just pulled up the notebook to test it out again.
Maybe something changed with Colab's pre-installed libraries. I'll report back with the results and whether I changed anything in the notebook.
Just found that Teachable Machine is using MobileNet to train image models. They're training a classification model on your images and resizing to 224x224 for optimal training and inference speed.
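For context, that kind of setup roughly corresponds to the following Keras sketch (illustrative only; this is not Teachable Machine's actual code, and the dataset path is a placeholder):

import tensorflow as tf

# Placeholder path: a classification dataset with one subfolder per class.
train_ds = tf.keras.utils.image_dataset_from_directory(
    'dataset/train', image_size=(224, 224), batch_size=32)
num_classes = len(train_ds.class_names)

# MobileNetV2 pretrained on ImageNet, with a new classification head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights='imagenet')
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1. / 127.5, offset=-1),  # MobileNetV2 expects inputs in [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(num_classes, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
model.fit(train_ds, epochs=5)

# Convert the trained classifier to TFLite.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
open('model.tflite', 'wb').write(converter.convert())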
Some other notes I wanted to add:
Hi,
The description says that the data should be downloaded as a folder structure.
In the Roboflow export there is no entry with the name "Folder Structure".
Description: How to Train MobileNetV2 On a Custom Dataset
"Then simply generate a new version of the dataset and export with a 'Folder Structure'. You will receive a Jupyter notebook command that looks something like this:"
To receive the export type of "Folder Structure," you'll need to be working out of a classification project.
Are you working out of a single-label classification project?
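For reference, a "Folder Structure" classification export is organized with one subfolder per class, roughly like this (class and file names below are placeholders):

train/
    cookie/
        img_001.jpg
        img_002.jpg
    no_cookie/
        img_003.jpg
valid/
    cookie/
    no_cookie/
test/
    cookie/
    no_cookie/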
This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.