Batch Predictions / Limit on model.predict()

Hi - I am trying to get predictions on about 400 images. It seems that model.predict() tops out at around 100 sequential calls, after which I get an empty array back.

  • Is there an endpoint or method I can use to send all 400 images at once, or in larger batches (50, 100, etc.)?
  • Is there a limit on the model.predict() method?

Caleb


Hi Caleb,

Based on the information provided in the extracts, there isn’t a direct mention of a method to send all 400 images or larger batches at once for prediction. However, you can write a function that iterates over all of the images in a specified folder and retrieves predictions from the Roboflow API. Here’s an example of such a function:

import datetime
import os

IMAGE_DIR = "data"

def get_all_predictions() -> list:
    """Run model.predict() on every image in IMAGE_DIR and collect the results."""
    all_images = os.listdir(IMAGE_DIR)

    all_predictions = []

    for image_name in all_images:
        # `model` is assumed to be an already-initialized Roboflow model object
        predictions = model.predict(os.path.join(IMAGE_DIR, image_name), confidence=70).json()

        # Timestamp each result so predictions can be traced back later
        predictions["datetime"] = datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")

        all_predictions.append(predictions)

    return all_predictions


As for the limit on the model.predict() method, the extracts do not provide specific information on any such limit.
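In the meantime, one workaround is to split the 400 images into smaller chunks (e.g. 50 at a time) and run each chunk's predictions in parallel threads to overlap the network latency. This is a hedged sketch, not an official Roboflow API: the `model` object, the chunk size, and the thread count are all assumptions, and the API is still called once per image.

```python
from concurrent.futures import ThreadPoolExecutor

def chunked(items: list, size: int) -> list:
    """Split a list into consecutive chunks of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def predict_batch(model, image_paths: list, confidence: int = 70) -> list:
    """Run model.predict() over one chunk of images using a thread pool.

    Assumes `model` is an initialized Roboflow model object. Threads only
    overlap the per-request network latency; they do not change any
    server-side rate limits, so keep max_workers modest.
    """
    with ThreadPoolExecutor(max_workers=8) as pool:
        futures = [pool.submit(lambda p: model.predict(p, confidence=confidence).json(), path)
                   for path in image_paths]
        return [f.result() for f in futures]
```

You would then loop over `chunked(all_images, 50)` and call `predict_batch` on each chunk, which also makes it easy to pause between chunks if you suspect you are hitting a rate limit.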

If you’re experiencing issues with the model.predict() method, it might be a good idea to check the official Roboflow documentation or reach out to their support for more specific guidance.


Hi! I’m Roboflow’s AI powered bot. This is an automated response powered using AI with knowledge from our docs, blog, and forum. If this solves your issue, please mark it as the solution using the checkbox icon. If this doesn’t solve your issue, please follow up with what you’ve tried/why, and the community will continue to chime in as usual.

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.