Yes, you can train a YOLO model on a custom dataset, including new objects that are not among the 80 COCO classes YOLO models are typically pretrained on. This involves creating a custom configuration file for your model that specifies the number of classes and, for Darknet-based versions, the number of filters in the convolutional layers immediately before the YOLO layers.
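For Darknet-style configs, that filter count follows a fixed formula: each of the 3 anchors per scale predicts 4 box coordinates, 1 objectness score, and one score per class. A quick sketch of the calculation (the function name is my own, for illustration):

```python
def yolo_conv_filters(num_classes: int, anchors_per_scale: int = 3) -> int:
    """Filters for the [convolutional] layer immediately before each
    [yolo] layer in a Darknet config: per anchor, 4 box coordinates
    + 1 objectness score + one confidence per class."""
    return anchors_per_scale * (num_classes + 5)

# COCO's 80 classes give the familiar 255 filters:
print(yolo_conv_filters(80))  # 255
# A 2-class custom dataset would need 21:
print(yolo_conv_filters(2))   # 21
```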
However, adding new classes on top of existing pretrained weights is not straightforward. The pretrained weights were trained for a specific number of classes, and the shape of the detection layers in the weights file is tied to that number. To add new classes, you would typically either train the model from scratch or fine-tune it on a combined dataset that includes both the original COCO data and your new custom data.
In the case of YOLOR, for example, the model is pretrained on a generic task over a large dataset corpus, and then a domain-specific dataset is used to fine-tune it. The same set of model weights is updated during this pretrain/fine-tune process (source).
For YOLOv4, the training process involves configuring the GPU environment, installing the Darknet YOLOv4 training environment, downloading the custom dataset, configuring a custom YOLOv4 training config file for Darknet, and then training the custom YOLOv4 object detector (source).
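Concretely, the config step usually amounts to editing `classes` in each `[yolo]` section of the `.cfg` file and `filters` in the `[convolutional]` section just above it. A sketch for a 2-class dataset (values are illustrative, and this is one of the three detection heads):

```ini
[convolutional]
size=1
stride=1
pad=1
filters=21        # (classes + 5) * 3 = (2 + 5) * 3
activation=linear

[yolo]
classes=2         # number of custom classes
```

Training is then typically launched with something like `./darknet detector train obj.data yolov4-custom.cfg yolov4.conv.137`, where the last argument is the pretrained convolutional backbone weights.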
For YOLOv5, the training process involves installing YOLOv5 dependencies, downloading the custom object detection data, defining the YOLOv5 model configuration and architecture, and then training the custom YOLOv5 detector (source).
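For YOLOv5, the custom dataset is described by a small YAML file passed to the trainer; a minimal sketch, with placeholder paths and class names:

```yaml
# data.yaml -- custom dataset definition for YOLOv5
train: ../train/images
val: ../valid/images

nc: 2                      # number of classes
names: ['helmet', 'vest']  # illustrative class names
```

Training is then typically started with something like `python train.py --img 640 --batch 16 --epochs 100 --data data.yaml --weights yolov5s.pt`, which fine-tunes from the pretrained `yolov5s.pt` checkpoint.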
Remember, the process of adding new classes and training the model can be complex and may require a good understanding of how YOLO works, as well as experience with machine learning and neural networks.
Hi! I’m Roboflow’s AI-powered bot. This is an automated response generated with knowledge from our docs, blog, and forum. If this solves your issue, please mark it as the solution using the checkbox icon. If this doesn’t solve your issue, please follow up with what you’ve tried/why, and the community will continue to chime in as usual.